
Alchemist Accelerator loses new managing director to AI startup


Rachel Chalmers is president of Alchemist Accelerator, the Bay Area incubator program for startups that focus on enterprise customers. (Image: Alchemist Accelerator)

After just eight months on the job, the new managing director at Alchemist Accelerator is stepping away to join a new AI startup. 

Rachel Chalmers joined the Bay Area accelerator three years ago to run its innovation arm and was appointed managing director in September when founder Ravi Belani stepped down.

Chalmers had planned to spend the rest of her career working with enterprise-focused startups at Alchemist, but then a former colleague reached out to her about an opportunity in AI.

"I have done a 180o on the technology, on the space and on my career goals," Chalmers told me. "Sometimes life just drops an opportunity in your lap that you would kick yourself if you missed."

On Tuesday, Alchemist also held its semiannual demo day, which included four new graduates from the Bay Area. The accelerator, which is focused on supporting startups developing enterprise software, has $58 million in assets under management and more than 400 companies currently in its portfolio, according to PitchBook.

Belani will return as the interim managing director until a permanent replacement is found.

I spoke with Chalmers about why she changed her mind on AI, whether there's overlap with Web3, and why businesses should always center people in conversations around technology.

What changed your mind on AI? People I genuinely respected digging into the technology and saying this is not like crypto, this is real. Reading about people who use the technology to augment their own intelligence. And realizing myself that what I had originally disliked — this positioning of AI as a way to eliminate creative jobs and to replace writers and to replace artists — you can flip that math and say, what if we built tools that made artists and writers and presidents of accelerated companies and journalists much, much better at their job?

That made the hair on the back of my neck stand up because it reminded me of the early days of the web. I had intended to be an English professor. I went into journalism. Finding the web, looking at “view source,” and teaching myself HTML gave me the next 25 years of my career. If we can build products that are similarly accessible to non-programmers, to make them able to be autonomous agents for the next 25 years of their career, we can have more impact than the web had.

Job losses are a real concern for people, and AI is a sticking point in the WGA strike. What do you say to those people? I really want writers to be paid what they're worth. I thought very deeply about what the large language models are. Essentially, what OpenAI has done is sucked down content on the open web. It's public domain content.

If you come at it from the other side and we can all participate on equal terms in interacting with the corpus of human knowledge and extending it — that redefines labor to be, what is it that you uniquely can do that you bring to the market that's not already commodified by all of the other ideas everyone's already had? That starts to get really interesting. What are the things that humans can do that machines cannot do for the foreseeable future? Suddenly, a whole bunch of jobs that had been routinely devalued by these exact same tech bros pop into light as extraordinarily valuable human work.

It's teaching kindergarten, it's nursing the dying, it's building relationships, it's working with people to improve their skills. It's all of the stuff that actually gives us a lot of joy in our work. It's none of the routine stuff. It's all of the intuitive, creative, interpersonal things that make us human. The playful, the unexpected, the funny.

What do you think needs to take place to help people become more comfortable with the idea of generative AI being a bigger part of their work, while also having tangible protections for jobs? I think the workers need to seize control of the means of production. What OpenAI has done is in some ways theft. My frickin' personal blog is in there, and I never gave permission for that. I won't get a cut of the money that OpenAI will make from it.

So, how do you deal with the fact that's already happened, and data that's already in the public domain will be commodified? You think about what you can create in the future. And you think about how you can put ethical boundaries around how that's shared. Maybe that means you're running your own LLM model on your own hardware, and you have a broker layer which writes contracts with people who want to ingest that data and makes sure that you're fairly compensated.

Like the move from publishing in the old-fashioned sense to blogging, if you own your own web server and can publish your own blog. It does feel in that sense like a second chance at the open web. It feels as if we can move away from these highly centralized, highly monetized corporatist models of creative production to something which is within the power of ordinary people to control again.

This sounds like it might be somewhere in the Web3 realm? I think the decentralized piece of Web3 reflected a very deep desire on the part of participants to own their own piece of the digital world. I think a better model is Mastodon. People can set up their own Mastodon servers, the servers are federated, but there's an enormous amount more control. It's a little bit more difficult to navigate, but it doesn't have the single point of failure that is Jack (Dorsey) selling the company to Elon (Musk).

Do you see a decentralized, blockchain-powered future for the web? Outside my local cafe, there was a sticker on the bike rack saying, “There is no problem that blockchain solves better than existing alternative technology.” That's what San Francisco's like, but that's my belief. Blockchain is: What if you had a distributed database, but made it really slow? It's good for provenance. I'm not sure it's good for anything else.

What's interesting about the Web3/generative AI overlap is that people really, really want decentralization. They really don't want their favorite services to be vulnerable to somebody else's whim. There needs to be a little bit more control at the edge and a little bit less overwhelming consolidation at the center.

How should enterprise corporations think about the way they invest in things like generative AI or the metaverse? You will never go wrong investing in people. The future is gonna belong to the people who build the smartest teams. Generative AI can make smart people even smarter. What I am investing my career in, what I have invested the last five years of my career in and will continue to because it's been the best returns of anything I've ever done, is making teams smarter and better. Building high-trust environments, building blameless culture, trying to create conditions where amazing people can do their best work. That's always going to be a winning strategy, no matter what the market does.

