The Journey of Nvidia’s Dominance in the AI Industry

Lauren Goode, Michael Calore, Will Knight

Unless you were really into desktop PC gaming a decade ago, you probably didn’t give Nvidia much thought until recently. The company makes graphics cards, among other tech, and has earned great success thanks to the strength of the gaming industry. But that’s been nothing compared to the explosive growth Nvidia has enjoyed over the past year. That’s because Nvidia’s tech is well-suited to power the machines that run large language models, the basis for the generative AI systems that have swept across the tech industry. Now Nvidia is an absolute behemoth, with a skyrocketing stock value and a tight grip on the most impactful—and controversial—tech of this era.

This week on Gadget Lab, we welcome WIRED’s Will Knight, who writes about AI, as our guest. Together, we boot up our Nvidia GeForce RTX 4080 Super graphics cards to render an ultra-high-def conversation about the company powering the AI boom.

Read Lauren’s interview with Nvidia cofounder and CEO, Jensen Huang. Read Will’s story about the need for more chips in AI computing circles, and his story about the US government’s export restrictions on chip technology. Read all of our Nvidia coverage.

Will recommends WhisperKit from Argmax for machine transcription. Mike recommends getting your garden going now; it’s almost spring. Lauren recommends Say Nothing, a book by Patrick Radden Keefe.

Will Knight can be found on social media @willknight. Lauren Goode is @LaurenGoode. Michael Calore is @snackfight. Bling the main hotline at @GadgetLab. The show is produced by Boone Ashworth (@booneashworth). Our theme music is by Solar Keys.

You can always listen to this week’s podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here’s how:

If you’re on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts, and search for Gadget Lab. If you use Android, you can find us in the Google Podcasts app just by tapping here. We’re on Spotify too. And in case you really need it, here’s the RSS feed.

Note: This is an automated transcript, which may contain errors.

Lauren Goode: Mike.

Michael Calore: Lauren.


Lauren Goode: Let’s go back about 10 years. When you thought of Nvidia back then, what did you think of?

Michael Calore: I think of big CES press conferences with the company talking about things like Tegra supercomputing chips and these big events that generally just served word soup.

Lauren Goode: That is very accurate. Can you guess what the stock price of Nvidia was then?

Michael Calore: I have no idea.

Lauren Goode: Are you ready for it?

Michael Calore: Yeah.

Lauren Goode: It was between $3 and $5.

Michael Calore: What is it now?

Lauren Goode: It’s hovering around $800.

Michael Calore: Oh, my God. Stop.

Lauren Goode: Mm-hmm. Seriously.

Michael Calore: Well, we don’t own tech stocks here, so sad for us. But what happened to Nvidia?

Lauren Goode: Basically, Nvidia started to take over the computing world.

Michael Calore: OK, we need to talk about why.

Lauren Goode: We really do. Let’s do it.

[Gadget Lab intro theme music plays]

Lauren Goode: Hi, everyone. Welcome to Gadget Lab. I’m Lauren Goode. I’m a senior writer at WIRED.

Michael Calore: And I’m Michael Calore. I am WIRED’s director of Consumer Tech and Culture.

Lauren Goode: We’re joined this week by WIRED senior writer, Will Knight, who joins us from Cambridge, Massachusetts. He’s on Zoom and he has averted his eyes from the latest AI research paper to humor us on the Gadget Lab. Hi, Will.

Will Knight: Hello.

Lauren Goode: Thanks for being here.

Will Knight: Thanks for having me.

Lauren Goode: OK. We invited Will here today to dig into the astounding rise of Nvidia, the company that got its start in the 1990s selling graphics chips for PC video games. That’s a bit of an oversimplification, but the gist is that, from the beginning, Nvidia made a bet on accelerated computing instead of general-purpose computing. The company designed specialized chips that supercharged what PCs could do. But today’s Nvidia is much different from your typical Gen X graphics-chip maker. Its cofounder and CEO, Jensen Huang, has consistently positioned the business just ahead of the technological wave. Today, Nvidia holds the dominant share of the market for AI computing chips and is valued at nearly $2 trillion.

I had an opportunity to interview Jensen recently for a WIRED feature. Sadly, you won’t be able to hear those conversations here; you’ll have to read them in WIRED. I’d also suggest checking out the Acquired podcast for an extremely thorough multi-episode series on Nvidia, which concludes with a chat with Jensen. But our goal here is to give you the most succinct version of Nvidia’s story and where it’s headed next.

Maybe we should start by talking about Nvidia’s early years and the personal computer era of the ’90s, and how everything evolved from there, right?


Michael Calore: Yes. The company started in 1993, you said?

Lauren Goode: That’s correct. It’s just about as old as WIRED.

Michael Calore: What was their big breakthrough that put them on the map?

Lauren Goode: Well, we should go back to what PCs were like in the 1990s and specifically what games on PCs were like. Games were starting to become more popular, but they were powered by CPUs, central processing units, and then graphics were kind of like an also-ran. These chips could also do graphics, but they weren’t very good. And then Nvidia had this idea to move more towards a customized or specialized unit, a graphics processing unit, which is how we get GPU. A lot of our listeners are going to know what this means, CPU and GPU and the differences, but other people just probably hear these acronyms all the time and don’t fully understand what they mean and how they power a computer.

When Nvidia was first conceived, Jensen Huang was working at LSI Logic. Two of his friends, who became his cofounders, persuaded him to start a dedicated graphics chip company. He agreed, left his job, and together they launched Nvidia. There’s a story that the whole concept was initially hatched at a Denny’s diner. That was the start of Nvidia in 1993.

Michael Calore: The time we all used to spend in Denny’s diners.

Lauren Goode: I think some people still go. Good for them.

Michael Calore: How did it go from there? Did everything unfold as planned, or were there bumps along the way?

Lauren Goode: There were definitely some bumps in the road in the mid to late-1990s. One of the first chips that they put out was pretty much a failure and the company nearly went bankrupt. They had to lay off a lot of people, something like close to 70 percent of the staff, and then they had to come up with a plan for a new chip super, super quickly. But a lot of tech production cycles, they’re 18 to 24 months, and that was certainly the case for chips. Nvidia didn’t have that much runway. They also didn’t have their own fab, which is a place where you make all the chips. You’re relying on partners to do this. You’re working with them hand in hand on the designs and then they’re sending chips to you and you’re making tweaks and sending it back and stuff. It’s a long process.

Nvidia did something smart, which is they started using emulators. They started crafting this second chip that they had to launch very quickly in software and testing it that way. They were able to spin up something new within… I think it was six-ish months, called the RIVA 128. That basically saved the company in its earliest days. There have been a few moments like that in Nvidia’s history where they’ve made a bet on the next thing that’s going to happen. You’re betting the farm, and so far they’ve just managed to grow the farm. It’s like big ag now. It’s massive.


Michael Calore: I want to ask Will a question about what a lot of people consider to be the birth of this modern era of AI, which Nvidia is a big part of. Can you tell us about what happened roughly 10, 12 years ago in the world of artificial intelligence computing?

Will Knight: Yeah, sure. Nvidia is very much wound up in the origins of modern AI, and we forget now because everything is machine learning based and AI algorithms are so capable. But back before 2010, 2012, people were coding all these things by hand to try and have machines do more intelligent tasks. There was a group of people who focused on this neural network approach, which was totally out of vogue. It hadn’t worked. The networks had been too puny to do anything very impressive, but they kept going at it. Around 2010, there was this confluence of enough data from the internet and these bigger neural network algorithms, which it was possible to run on GPUs, because GPUs are very parallel and the computations you want to do are inherently parallel.

Deep learning researchers discovered GPUs could significantly enhance their algorithms. In 2012, a competition was held to determine who could write the best image classification algorithm. The deep learning participants, who were utilizing GPUs, markedly outperformed the competition. This wasn’t necessarily anticipated by Nvidia, but their chips proved perfect for the task, leading to further exploration.

Lauren Goode: And then in 2017, there was another turning point when Google published the Transformer paper. This was another major impetus for the new age of AI, with Nvidia playing a role here too.

Will Knight: Absolutely. The Transformer paper introduced a novel method for implementing machine learning, particularly in the field of language. It was extraordinarily effective and it’s what has given us these current language models with their incredible capabilities. Prior to the Transformer paper, the emphasis was on image recognition and voice recognition, with Nvidia deserving much credit for developing the tools that many people embraced. When the Transformer era began, it heralded the start of this generative AI language model and chatbot phase, which has since gained a great deal of momentum and extensively expanded.

Michael Calore: The industry is currently exploding, with startups and established businesses vying for computing power. Nvidia GPUs became notoriously difficult to source for several years as they were also being utilized for bitcoin and cryptocurrency mining, at one point making them more valuable than gold. The company is still grappling with this supply chain issue, trying to saturate the market with chips. However, from your conversation with Jensen, Lauren, it seems that his primary hardware focus for the future is on enabling large-scale computing: establishing data centers solely for AI and creating sizeable appliances for companies to conduct their own on-site AI computations.


Lauren Goode: Yeah. Nvidia’s data center business is already quite significant, but when we first discussed this topic in early January, it was one of his primary focuses. He had just completed a conversation with a technology partner about AI data centers. In a nutshell, the computing industry has transitioned from on-device computing to cloud computing. On-device computing pertains to the bygone era of graphics cards crammed into your PC, handling most or all of the accelerated computing. With the emergence of cloud computing options from major tech companies like Google and Amazon, much of this computation process has shifted to the cloud. Microsoft also plays a role in this. User inputs are sent from the computer to a server elsewhere, and the resulting output is then dispatched back to the user’s device. This is a key factor that enables software to “scale” rapidly. This concept of scale is quite popular among techies and venture capitalists. Everyone appreciates scalability.

Michael Calore: Scale, that’s where the profits are.

Lauren Goode: Exactly. The quest is always for scaling up. Even WIRED may need to augment its scale these days. Nvidia plays a role in this scenario, and it now aims to create more of these AI supercomputing data centers. These will not only facilitate software companies but also manufacturers, self-driving car initiatives, and biotech companies that are increasingly incorporating AI into their operations.

Michael Calore: Nice.

Will Knight: The concept of an AI factory that you mentioned during the interview was really intriguing. It speaks to this moment we’re in with AI, where the big players developing the algorithms, such as Google, Meta, and Microsoft, have started building their own chips. One thing Nvidia can do to potentially stay ahead, or keep its edge, is… They already build these data centers, linking all the chips together, and he also spoke about that particular networking company. I think the idea of building these enormous data centers with a primary focus on improving the networking and the other facilities is where things are headed.

To a certain extent, that reflects the sheer size of the models at hand: you need tens of thousands of chips linked together as optimally as possible, something that was unthinkable a decade ago. Still, I’m convinced he is astoundingly adept at predicting what could pose a threat, as well as spotting the opportunity to deliver that AI to a whole spectrum of clients. Huge entities like Google will probably try to cater to those same manufacturers, automobile companies, and the like. I found that AI factory concept absolutely intriguing.

Lauren Goode: Yes, Will brings up a thought-provoking point. Nvidia’s acquisition of Mellanox was definitely a tactical move, because it extended Nvidia’s access to networking technology at the chip level. But a number of the companies Will mentioned are also deeply resourced and might have the technical prowess to envision and build something comparable, and they are definitely trying. Time for a short break, and we’ll continue our discussion of Nvidia’s competitors right after.


[Break]

Lauren Goode: Currently, Nvidia dominates the AI sector. Its much sought-after supercomputing GPUs and its strategic investments in the field are clear indications of its strength in this space. Moreover, its proprietary programming model, Cuda, and heavily loaded data centers further consolidate its leadership position. However, major tech companies like Google, Amazon, Microsoft, and Meta are also vying for supremacy in this arena using their enormous financial resources. Will, can you name a company that could potentially surpass Nvidia?

Will Knight: As I see it, Google could be a massive challenger. Google’s development of its own AI chips and its pivotal role in AI software is a testament to this. Although Nvidia’s chips are the most potent ones around, multiple less powerful chips networked together can pose a serious challenge, especially if there are minimal communication bottlenecks between them. Google’s recent experiment showcased this, where it networked together 50,000 chips for language model training. Networking is a considerable bottleneck, and reducing it lets you scale up the overall supercomputer, compensating for less advanced chips. Google focuses on developing its own chips and using optical networking between those chips to improve this aspect. While other companies like Microsoft are working on their own AI chips, and a slew of startups like Cerebras are exploring unconventional routes, I believe Google is Nvidia’s most significant competitor.

Indeed, it was rather interesting when Google excluded Nvidia from its announcement of the Gemini AI models, considering it now depends on its own tensor processing units, otherwise known as TPUs.

Michael Calore: Really? I’m going to bet on AWS as the dark horse here, Amazon Web Services. Obviously, they’ve been working on this for a long time and they don’t have as much to show for it, but I think they’re holding their cards to lay them down at the right time.

Lauren Goode: Interesting.

Michael Calore: Yeah.

Will Knight: There was a very interesting announcement where Anthropic, one of these competitors to OpenAI, which was founded by people from that company, got a huge investment from Amazon AWS. Part of that deal was that they were going to run their next model, their next competitor to GPT-4 or 5 on Amazon silicon. That will show that their silicon is competitive and that’s going to be one of the ways they try and sell that to all their customers.


Michael Calore: Yeah.

Lauren Goode: Will, where do AMD and Intel fit into all of this?

Will Knight: Yeah. I think AMD has come out recently with some chips that are more competitive and likely to be a bit more of an option for people developing models. Intel… I mean, it is trying to get back into the game of making these, but it’s very far behind. But it’s got a huge amount of money from the US government to try and improve what it’s doing, so that could include making chips for Nvidia. They want to do that. They want to try and have this cutting-edge process, but they’re also looking to develop their own chips somewhere down the line. I think they’re a long way off.

Lauren Goode: They’ve been on a bit of a press blitz recently. You wrote about them in WIRED.com last week.

Will Knight: Yeah. Well, they are receiving a huge injection of cash from the CHIPS Act. The US government is trying to ensure that America remains competitive when it comes to chip production, due to concerns over the supply chain and possible future difficulties if access to TSMC or Samsung were cut off. So Intel is determined to make a resurgence. Recently, Intel has started producing chips on processes comparable to TSMC’s. It’s been taking a very aggressive approach and announced that Microsoft will manufacture its AI chips on Intel’s platform. However, it remains to be seen how successful this comeback will be.

Michael Calore: I’m interested to know about the other side of the coin. While the US government is supporting local chip factories and promoting more technology development in the United States, they’ve also imposed export controls on that technology which prevents sharing our latest technological breakthroughs with other countries – primarily China. Can one of you elaborate on what impact this is having on the industry?

Lauren Goode: Certainly, I can speak for Nvidia and perhaps Will can shed some light on the wider industry. Nvidia has been impacted by these export controls which were first announced in August of 2022. There have been some updates to the export controls, but essentially Nvidia had to modify or adjust its chips so they complied with these controls and could still be shipped to China—a market of vital importance for them as well as many other tech companies. The objective here is to ensure that the US and other western countries have access to the most advanced technology and we’re not simply handing this technology over to China. Nvidia wanted to continue sales in that market, but they had to adapt. They had to change their formula to make sure they weren’t selling the very best technology there. This also has affected their data center business. Will, do you have an understanding of how this is influencing the wider chip industry?


Will Knight: I think it’s one of the more fascinating moves in tech policy that the US government has made, certainly in recent years, because they’re basically cutting off one of the most lucrative and important industries’ access to the biggest and fastest-growing market, to try and maintain this technological edge. People within the chip industry are quite worried, I think rightly, that China is going to simply encourage Chinese companies to be more competitive and to gain more of an edge. The whole rationale is that chip making is so difficult, and all of the components and technologies and hardware you need come from America or American allies, so you can limit that. But there have been some recent moves suggesting that China may be moving more quickly in developing more cutting-edge chips.

Lauren Goode: A question for you both. It’s clear that AI isn’t perfect, particularly in the realm of generative AI. It can be as minor as getting a simple math problem wrong, or as severe as a self-driving car failing to identify a pedestrian in its path, or even an erroneous drone strike. These are grave concerns, and I’m curious how this affects the public view of AI and the companies behind it, especially when things go wrong. Nvidia, for instance, can say it merely provides hardware, even though it has a platform like Cuda that developers are invested in. But for companies like Google, Microsoft, or Amazon, which own the entire AI computing stack, what is the aftermath when things go wrong?

Will Knight: Your point touches on how machine learning systems can fail in unique ways. In the self-driving car field, for instance, pinpointing the cause of an error becomes challenging when innately probabilistic machine learning processes sit alongside hard-coded rules and hardware. If a fault traced back to a flawed chip component, blame might be allocated there. But an ongoing challenge in industries that rely heavily on AI is understanding the origin of errors and determining system reliability. These aren’t normal engineering tasks, since these systems don’t operate the same way each time, so engineers need to work around that.

Michael Calore: What interests me is that the current ground-level revolution is all about chatbots. People are excited about chatbots and their potential for societal integration. They are highly interactive. People converse with their phones and with customer service agents who may or may not be human. Having a virtual interaction with an AI version of one’s favorite celebrity feels like stepping into the future. The significant shift will come when we learn that some oppressive systems, such as banking and college admissions…


Lauren Goode: Housing, job applications, healthcare, yep.

Michael Calore: Law enforcement. When those systems become more and more reliant on machine intelligence and we notice that the oppression is staying the same or getting worse and it is not really helping us, our perception will change from “these things are the future, cool” to “these things are the future. This sounds terrible.” I don’t really know what to say about hallucinations, because I think you really have to be embedded in the tools, really using the tools, in order to see those smaller, more nuanced problems and to gain an understanding of them. Most people, and by most people I mean like 95 percent of the people out there, are not deep in those tools and they’re not super familiar with them, but they are encountering them whether they know it or not. I think the next couple of years are going to be wild.

Lauren Goode: There’s just so much potential for additional layers of obfuscation.

Michael Calore: Yes.

Lauren Goode: Not being able to actually pinpoint where the liability should sit.

Michael Calore: Yes. Yes. Really, I think it’s going to… People are going to get even more upset at tech companies and billionaires and all the things that they’re upset about now.

Lauren Goode: Does that upset you, Mike? I know you love billionaires.

Michael Calore: I love them as people, yes. They’re great people. Well, I’m sure some of them are.

Will Knight: I’ve been thinking about this in the context of self-driving cars, because on the one hand I think it’s really shocking that—In some ways, it’s really insane that they test these experimental vehicles on the roads with pedestrians who haven’t signed up for it at all. But then on the other hand, 35,000 people die a year because of terrible human drivers. We haven’t had any kind of public discourse about that. It’s just being done by the companies that have the most money, but, I mean, it is a really interesting thing to think about how to weigh those things.

Lauren Goode: It’s interesting to think about the things that get overlooked. For instance, one of the hardware firms powering autonomous vehicles told me that after they began deploying the cars, they realized they hadn’t tested for events like Halloween. Specifically, they hadn’t examined scenarios where people are crossing the streets on a dark night wearing attire that could disguise them as non-humans. They had to go back and run those tests.

Michael Calore: Are we dealing with a pantomime horse controlled by two people, or a real horse without any human beings?

Lauren Goode: Exactly, could that potentially be Batman?

Michael Calore: Yes, indeed.

Lauren Goode: All right. Well, we’re not going to be able to answer all of these questions on this week’s podcast, but hopefully we gave you a good primer on Nvidia and where it’s going. Will, we very much appreciate you being a part of that. We’re going to take another quick break and then come back with all of our recommendations.


[Break]

Lauren Goode: All right, Will. What is your recommendation?

Will Knight: My recommendation is this application called WhisperKit, which is from a company called Argmax, which was founded by some Apple developers who left to do their own thing. I think it’s appropriate because it’s a good example of the importance of the edge. This isn’t like sending your audio to the cloud. You can do quite advanced voice transcription on your computer, which is obviously important for journalists and other people, using… They use some software that came from OpenAI, but they just optimized it very much for your own hardware. It’s a good example of how… Maybe a lot of AI is also going to happen on the edge, as well as in the cloud.

Lauren Goode: And what are you using it for?

Will Knight: Recording everybody.

[Lauren and Michael laugh]

Michael Calore: Your own home speech recognition applications that you’re running?

Will Knight: Yeah, just all the time. I can just remember everything. I can just settle arguments by winding back the tape and playing what people said.

Lauren Goode: How many arguments do you get into? Will, you don’t strike me as argumentative.

Will Knight: You’d be surprised, but, no, I don’t really use that. I was using one of these cloud platforms for transcription, but I wanted something that wasn’t… I kept running into the limit of how much I could record, which actually is very annoying and they’re quite expensive. I figured it’d be interesting to play with this for transcribing interviews.

Michael Calore: And pretty accurate?

Will Knight: Yeah, it’s pretty accurate. I use this thing called Whisper from OpenAI, which is pretty good. Yeah, you have to go back, make sure you’re not misquoting people, but, yeah, pretty good.

Lauren Goode: How secure is it? Would you use it to process your most sensitive interviews?

Will Knight: Well, it’s all running on my computer. Assuming my computer hasn’t been hacked, which is never a given, it certainly seems more secure than sending it to the cloud.

Lauren Goode: Interesting. Yeah. I use Google’s transcription service, which is pretty darn good. It does it on device, but then I do send it to the cloud. You use Otter, right?

Michael Calore: I use Mechanical Turk. No, I’m kidding. I use Alice.

Lauren Goode: What’s that?

Michael Calore: Alice AI. It’s another one of the front ends for… I think they use Google’s transcription service. But also, I have a Pixel phone, so if I record on my Pixel phone then it just transcribes it for free.

Lauren Goode: Right. My second phone’s a Pixel for all my shady activity.

Michael Calore: Yep.

Lauren Goode: Thank you for that, Will. We’re going to link that in the show notes. Mike, what’s your recommendation?

Michael Calore: Get your garden ready. This is my recommendation. This week, February turns into March and in just a couple more weeks it will be spring. Spring will have sprung and it is time to plant the vegetables and the fruits and the flowers that you would like to be eating this summer. If you live in a slightly warmer part of the country like we do here in California, or if you live in the south, then you can start planting outdoors pretty much now or next week. If you live in a colder part, you will have to use a greenhouse or you can do like I did and get a seedling mat, which is like a heated mat that you put your seedlings on. You can leave them on your enclosed porch or in your basement or your attic and make sure that you can grow healthy plants.


I, like many people, got into gardening a little bit more than usual during the pandemic because all of a sudden I had all this time at home that I could pay attention to the plants. Now that I’m in the office most of the days, my garden has gone into disarray. There’s a lot of weeds. I’ve switched to succulents for most of it, but I am determined to do some California wildflowers this year and to do some peppers that we can all enjoy either fresh or pickled this summer. I would say that if you are a person who has always thought about gardening or if you used to be a serious gardener, this is a big reminder. It’s a big flashing light sign that now is the time to plant again.

Lauren Goode: This is a great recommendation.

Michael Calore: Thank you. Are you going to plant anything?

Lauren Goode: I have already started with the plants. I recently picked up some dichondra silver, some lotus. I have Angel vine going pretty strong right now under our skylights in the kitchen.

Michael Calore: Can you eat any of these things?

Lauren Goode: No, but I have one plant… You know this plant, Kevin. It was rescued from a demolition site during the pandemic. My neighbor gave me this scrawny little thing in a pot and it’s a lemon tree. And I grew it. I mean, it’s really healthy now. It’s attached to a lattice and, yeah, I water it, I give it plant food. It provided a lot of lemons last year and I’m already seeing… The green ones are there now. What is that called when it’s not ripe? Unripe. Yeah. Kevin the Lemon Tree is doing… Actually, our friends Cyra and Adrian named it when they were drunk one night like, “OK, it’s Kevin.” Kevin’s doing great. Everyone come over for lemons.

Michael Calore: Nice.

Lauren Goode: But, yeah, no, I love it. It’s great. There are some other plants. There’s a Japanese maple in front of my place, but I don’t have to water that or anything. The Japanese maple housed the little nest from the hummingbirds last year.

Michael Calore: OK. Lauren, I just have to point out that all these things that you’re mentioning are plants and trees.

Lauren Goode: I know. I know. But you’re talking about planting. They are plants. You plant plants.

Michael Calore: This year, try herbs, chard, leafy greens, peppers. Get them going.

Lauren Goode: OK. My brother actually got me an herb garden for Christmas and I sent it back.

Michael Calore: Is it like one of the Click & Grow ones?

Lauren Goode: Yeah, it was like one of those indoor ones and I didn’t want it inside, but maybe I’ll do it outside.

Michael Calore: Does your brother listen to the show and will now be offended?

Lauren Goode: He does. Hi, Gerald. Yeah, he does. He listened to the… Yes, he listened to the episode where we’re talking about the Bono book.


Michael Calore: Oh, yes, yes.

Lauren Goode: Yeah. Yeah

Michael Calore: OK. Another gift that you returned?

Lauren Goode: Yeah.

Michael Calore: Before we go too far into plantasia, what is your recommendation?

Lauren Goode: Plantasia?

Michael Calore: What is your recommendation?

Lauren Goode: My recommendation is a book that is soon spawning a television show. Although, I did just look it up: I thought the TV show was coming out soon, but there’s no release date for it yet. It’s later in 2024. The book is called Say Nothing. It’s by our colleague at The New Yorker, Patrick Radden Keefe. Mike is smiling right now because we were saying earlier how saying the phrase “my colleague” can be so self-aggrandizing, because I don’t actually know Patrick Radden Keefe. I admire his writing a great deal.

Michael Calore: He used to write for WIRED.

Lauren Goode: I read more than one of his books. You told me that this morning and I was very excited. He worked… What is it called, Danger Zone?

Michael Calore: Danger Room. It was our defense tech section.

Lauren Goode: Yeah. How long ago was this?

Michael Calore: I don’t know. 2008, 2009 probably.

Lauren Goode: I mean, he’s in our Slack and we’re talking about him right now, but I’ve never met him. I think his writing is brilliant. I’ve read at least a couple of his books. Say Nothing is about the Troubles in Northern Ireland, and it’s multiple stories interwoven. It’s one of the best books I read in 2023 by far. I read it towards the end of the year, so it’s still fresh in my mind. It debuted back in 2019 and has now been adapted into a TV series. FX picked it up. In the United States, it’s set to premiere on Hulu later this year, but the exact date is still unknown, so stay tuned. In the meantime, I highly recommend reading the book.

Michael Calore: Nice.

Lauren Goode: That’s great.

Michael Calore: Nice.

Lauren Goode: Our colleague, by our colleague. All right. That’s our show this week. Will, thank you so much for joining us.

Will Knight: Thanks for having me.

Lauren Goode: And thanks to all of you for listening. If you have feedback, you can find all of us on the site formerly known as Twitter. Worst name change ever.

Michael Calore: Bluesky.

Lauren Goode: Bluesky, that’s right. Just check the show notes. Our producer is the excellent Boone Ashworth and we’ll be back next week.

[Gadget Lab outro theme music plays]
