- Meredith Whittaker, president of the Signal Foundation, gained a skepticism of profit-motivated companies while working at Google.
- Whittaker said the Signal app, which is encrypted, intentionally holds as little user data as possible.
- "On the record, loudly as possible, no!" Whittaker said, when asked if Signal plans to incorporate ChatGPT-like tools.
Meredith Whittaker took a top role at the Signal Foundation last year, moving into the nonprofit world after a career in academia, government work and the tech industry.
She's now president of an organization that operates one of the world's most popular encrypted messaging apps, with tens of millions of people using it to keep their chats private and out of the purview of big tech companies.
Whittaker has real-world reasons to be skeptical of for-profit companies and their use of data — she previously spent 13 years at Google.
After more than a decade at the search giant, she learned from a friend in 2017 that Google's cloud computing unit was working on a controversial contract with the Department of Defense known as Project Maven. She and other workers saw it as hypocritical for Google to work on artificial intelligence technology that could potentially be used for drone warfare. They started discussing taking collective action against the company.
"People were meeting each week, talking about organizing," Whittaker said in an interview with CNBC, with Women's History Month as a backdrop. "There was already sort of a consciousness in the company that hadn't existed before."
With tensions high, Google workers then learned that the company reportedly paid former executive Andy Rubin a $90 million exit package despite credible sexual misconduct claims against the Android founder.
Whittaker helped organize a massive walkout against the company, bringing along thousands of Google workers to demand greater transparency and an end to forced arbitration for employees. The walkout represented a historic moment in the tech industry, which until then had seen few high-profile instances of employee activism.
"Give me a break," Whittaker said of the Rubin revelations and ensuing walkout. "Everyone knew; the whisper network was not whispering anymore."
Google did not immediately respond to a request for comment.
Whittaker left Google in 2019 to return full time to the AI Now Institute at New York University, an organization she co-founded in 2017 that says its mission is to "help ensure that AI systems are accountable to the communities and contexts in which they're applied."
Whittaker never intended to pursue a career in tech. She studied rhetoric at the University of California, Berkeley. She said she was broke and needed a gig when she joined Google in 2006, after submitting a resume on Monster.com. She eventually landed a temp job in customer support.
"I remember the moment when someone kind of explained to me that a server was a different kind of computer," Whittaker said. "We weren't living in a world at that point where every kid learned to code — that knowledge wasn't saturated."
'Why do we get free juice?'
Beyond learning about technology, Whittaker had to adjust to the culture of the industry. At companies like Google at the time, that meant lavish perks and a lot of pampering.
"Part of it was trying to figure out, why do we get free juice?" Whittaker said. "It was so foreign to me because I didn't grow up rich."
Whittaker said she would "osmotically learn" more about the tech sector and Google's role in it by observing and asking questions. When she was told about Google's mission to index the world's information, she remembers it sounding relatively simple even though it involved numerous complexities, touching on political, economic and societal concerns.
"Why is Google so gung-ho over net neutrality?" Whittaker said, referring to the company's battle to ensure that internet service providers offer equal access to content distribution.
Several European telecommunications providers are now urging regulators to require tech companies to pay them "fair share" fees, while the tech industry says such costs represent an "internet tax" that unfairly burdens them.
"The technological sort of nuance and the political and economic stuff, I think I learned at the same time," Whittaker said. "Now I understand the difference between what we're saying publicly and how that might work internally."
At Signal, Whittaker gets to focus on the mission without worrying about sales. Signal has become popular among journalists, researchers and activists for its ability to scramble messages so that third parties are unable to intercept the communications.
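To make "scrambling messages" concrete, here is a toy sketch of symmetric encryption using a one-time pad, purely to illustrate the idea that an interceptor sees only unreadable bytes. This is an illustrative assumption, not Signal's actual design: Signal uses the far more sophisticated Double Ratchet protocol.

```python
import secrets

# Toy one-time pad: XOR each byte of the message with a byte of a
# random shared key. This illustrates the concept of scrambling,
# NOT Signal's real protocol (the Double Ratchet).
def encrypt(plaintext: bytes, key: bytes) -> bytes:
    assert len(key) == len(plaintext), "pad must match message length"
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = secrets.token_bytes(len(message))   # secret shared by both parties

ciphertext = encrypt(message, key)        # what a third party would see
assert decrypt(ciphertext, key) == message
```

Without the key, the ciphertext is statistically indistinguishable from random noise, which is the property end-to-end encryption aims for between the sender's and recipient's devices.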
As a nonprofit, Whittaker said that Signal is "existentially important" for society and that there's no underlying financial motivation for the app to deviate from its stated position of protecting private communication.
"We go out of our way in sometimes spending a lot more money and a lot more time to ensure that we have as little data as possible," Whittaker said. "We know nothing about who's talking to whom, we don't know who you are, we don't know your profile photo or who is in the groups that you talk to."
Tesla and Twitter CEO Elon Musk has praised Signal as a direct messaging tool, and tweeted in November that "the goal of Twitter DMs is to superset Signal."
Musk and Whittaker share some concerns about companies profiting off AI technologies. Musk was an early backer of ChatGPT creator OpenAI, which was founded as a nonprofit. But he said in a recent tweet that it's become a "maximum-profit company effectively controlled by Microsoft." In January, Microsoft announced a multibillion-dollar investment in OpenAI, which calls itself a "capped-profit" company.
OpenAI's convoluted structure aside, Whittaker isn't buying the ChatGPT hype. Google recently jumped into the generative AI market, debuting its chatbot dubbed Bard.
Whittaker said she finds little value in the technology and struggles to see any game-changing uses. Eventually the excitement will decline, though "maybe not as precipitously as like Web3 or something," she said.
"It has no understanding of anything," Whittaker said of ChatGPT and similar tools. "It predicts what is likely to be the next word in a sentence."
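Whittaker's point can be illustrated with a deliberately tiny sketch: a bigram model that, like language models generally (though at a vastly smaller scale than ChatGPT), picks the statistically likely next word given the previous one. The corpus and function names here are invented for illustration.

```python
from collections import Counter, defaultdict

# Tiny made-up training text.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word in the training text.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word most frequently seen after `word`."""
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # -> "cat" (seen twice after "the")
```

The model "knows" nothing about cats or mats; it only reproduces frequency statistics, which is the essence of Whittaker's criticism, even if large models capture far richer statistical patterns than a bigram count.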
OpenAI did not immediately respond to a request for comment.
She fears that companies could use generative AI software to "justify the degradation of people's jobs," resulting in writers, editors and content makers losing their careers. And she definitely wants people to know that Signal has absolutely no plans to incorporate ChatGPT into its service.
"On the record, loudly as possible, no!" Whittaker said.