OpenAI’s Rapid Growth and Departure of Key Engineer Highlight Industry Challenges

By Lisa Wong

OpenAI, the artificial intelligence research organization, has experienced meteoric growth, and that boom has prompted much debate about what it means. As of March 2023, OpenAI reported over 500 million active users engaging with its large language models (LLMs) for various applications, including medical advice and therapy. Given the rapid proliferation of these tools, concerns have mounted about the company’s commitment to safety and to building user trust.

That comes amid skyrocketing growth. In a single year, OpenAI expanded its workforce from 1,000 to 3,000 employees, an unprecedented leap that seems necessary given its ballooning user base. Such rapid scaling, however, has not come without its share of hiccups.

Calvin French-Owen, a former OpenAI engineer who resigned just three weeks ago, was instrumental in building one of the company’s most exciting new products. He left OpenAI to return to his roots as a startup founder. French-Owen is best known as the co-founder of Segment, a customer data startup that Twilio acquired for $3.2 billion in 2020. His unexpected departure raises questions not only about personal ambition but also about the organizational strains inside OpenAI.

French-Owen was clear that internal “drama” was not the cause of his departure. Rather, he pointed to how the company’s explosive growth had overwhelmed its communication channels, reporting structures, and processes for shipping product.

“Everything breaks when you scale that quickly: how to communicate as a company, the reporting structures, how to ship product, how to manage and organize people, the hiring processes, etc.” – Calvin French-Owen

To carry that innovation into its other products, OpenAI has assembled a team geared toward commercialization: engineers, researchers, designers, go-to-market staff, and product managers. Notably, a small but dedicated group built and launched Codex within an impressive seven-week timeframe, often working long hours with minimal rest. The effort was handled by just eight engineers, four researchers, two designers, two go-to-market staffers, and one product manager.

French-Owen pointed out potential risks associated with AI technologies:

“hate speech, abuse, manipulating political biases, crafting bio-weapons, self-harm, prompt injection.” – Calvin French-Owen

Hundreds of millions of people now use OpenAI’s tools. That sudden explosion underscores the urgent need for robust safety protocols and ethical principles governing AI development. As the organization continues its immense growth, it finds itself at a crossroads between its historic commitment to safety and its new, deeply ambitious growth targets.

A friend of French-Owen described the company’s culture as being influenced by external factors:

“this company runs on twitter vibes.” – Friend of Calvin French-Owen

As OpenAI navigates this complex landscape of growth and responsibility, it is essential that the organization address both internal and external perceptions of its commitment to user safety. The departure of key personnel like French-Owen serves as a reminder of the challenges rapidly expanding companies in the tech sector must face.