
The AI Startup Helping Uber, Salesforce And Hundreds Of Companies Cut Costs
But Visa's executives, who wanted to roll out a digital payment product in 40-plus languages, had only ever worked with human translators and didn't understand how her software would integrate with theirs. So Habib, now 40, stepped up to a whiteboard and mapped it out. And when they discovered a gap, she and her cofounder, Waseem Alshikh, went back to their house-cum-office in the Mission District and cranked out a GitHub integration to fill it.
Visa became Habib's first major enterprise client with a $126,000 contract soon after, and she raised $5 million a few months later. 'You're not selling your software, you're selling a different way of doing things,' she says.
That's the conceit behind Writer, Habib's artificial intelligence company, which has evolved dramatically since those early days. It now sells AI Studio, a Swiss Army knife suite of AI tools intended to expedite the corporate world's many simple, but often tedious and expensive, menial tasks. For cosmetics giant L'Oréal, Writer drafted thousands of product description blurbs; for Uber, hundreds of answers to frequently asked questions. Salesforce uses it to quickly gin up email and social media marketing campaigns. Those are just three of the 300 companies that pay, sometimes millions, to use Writer's customizable AI apps to automate time-consuming everyday work. That enthusiastic embrace by enterprises has helped Writer, one of the sizzling startups featured on Forbes' annual AI 50 list, raise some $320 million from top venture capitalists. Its $200 million round in November valued the company at $1.9 billion; Habib retains an estimated 15% stake worth $285 million.
At a time when so many companies are trying to figure out exactly how AI can help grow their business—and whether the investment is worth it—Writer's customers are using its tools to cut costs in a material way. An AI executive at a top health product retailer says their team uses Writer's AI to advertise on TikTok, Amazon and Walmart, which has generated $5 million in value annually between cost savings and new sales opportunities—a number they expect to balloon to $25 million in the next two years. Victoria's Secret–owned lingerie brand AdoreMe used Writer's AI to translate 2,900 product descriptions into Spanish as part of its expansion to Mexico, distilling a months-long process down to 10 days. 'The ROI is screaming at you,' says Sandesh Patnam, managing partner at private equity firm Premji Invest, who co-led Writer's funding round last year.
Such savings have spiked Writer's net retention rate to a stunning 160%, indicating that customers end up expanding their contracts by 60% on average. Habib says 20 customers started with contracts between $200,000 and $300,000, quickly found new ways to use Writer's tools and are now spending about $1 million each. A pitch deck from last fall shows it brought in $9.3 million in revenue in 2023 and was forecasting $28 million in revenue for 2024. Writer said the figures were inaccurate and declined to comment further, but shared that it currently has more than $50 million in signed contracts, which it projects will double to $100 million this year.
Writer's confidence stems from its newest product, 'AI HQ,' which includes tools to build artificially intelligent agents that can perform a series of tasks that are typically part of the workflow of an actual job. A financial analyst, for instance, could create an agent that pulls data from earnings-call transcripts, analyzes it and emails a personalized version to their entire client list. No coding is required; you just explain in plain English the sequence of steps you want the agent to carry out and click a button. At an annual cost that can reach into the millions, Writer also provides more than 70 pregenerated apps and agents that clients can use immediately. 'People don't have to do the work,' Habib says. 'They only have to build AI that does it for them.'
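To make that idea concrete, here is a minimal sketch of the kind of multi-step pipeline such an agent encodes, written as plain Python purely for illustration. Every function and name below is a hypothetical stand-in, not part of Writer's product or API.

```python
# Illustrative only: a hand-rolled version of the "earnings-call analyst" agent
# described above. These names are hypothetical stand-ins for the steps a
# no-code agent builder would wire together.

from dataclasses import dataclass

@dataclass
class Client:
    name: str
    email: str
    holdings: list[str]

def pull_transcript(ticker: str) -> str:
    # Placeholder: a real agent would fetch this from a transcript provider.
    return f"{ticker} Q2 earnings call: revenue up 12%, margins steady."

def analyze(transcript: str) -> str:
    # Placeholder: a real agent would call a language model here.
    return f"Summary: {transcript} Outlook appears stable."

def draft_email(client: Client, summaries: list[str]) -> str:
    body = "\n".join(summaries)
    return f"To: {client.email}\nHi {client.name},\n{body}"

def send(email: str) -> None:
    # Placeholder: a real agent would hand this off to an email service.
    print(email)

clients = [Client("Ana", "ana@example.com", ["ACME"]),
           Client("Bo", "bo@example.com", ["ACME", "GLOBEX"])]

for client in clients:
    summaries = [analyze(pull_transcript(t)) for t in client.holdings]
    send(draft_email(client, summaries))
```

In Writer's no-code builder, the equivalent of each function above would be a step described in plain English rather than written as code.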
Already, major customers like fintech Intuit ($16.3 billion in 2024 revenue) and homebuilder Lennar (sales: $35.4 billion) are trying out Writer's agent tools. Lennar CTO Scott Spradley says his team's Writer agents have so far written thousands of email responses to inquiries from potential buyers, scheduling appointments to view the company's thousands of homes and providing information like their price or location. 'It's driving volume, it's driving activity, it's creating better leads,' he says.
It also heralds the beginning of a paradigm shift for businesses looking to shave labor costs. Why bother paying people to do what an AI agent can do equally well and faster? That's the real opportunity for Writer and the rest of the $58 billion enterprise AI market. Habib is blunt about what this means for rank-and-file knowledge workers: 'Ten percent of the headcount is going to be enough.'
With the enterprise AI software market poised nearly to double to $114 billion by 2027, competition is stiff. Deep-pocketed OpenAI and Anthropic, which have raised a collective $42 billion, sell bare-bones models that businesses can then use to build tools. But they typically require a team of developers to calibrate, deploy and update. Writer's tech is largely plug-and-play, with user-friendly drag-and-drop interfaces. It doesn't require messing around with AI models or crafting the perfect prompt every time. 'We make so much of the magic invisible for folks,' Habib says.
It has done that by home-brewing the models that power AI Studio and AI HQ. That's important for security: Client data is retrieved from dedicated servers and isn't used to train models, mitigating concerns about sensitive information leaking. Writer's models address another core concern businesses have with AI: its occasional tendency to make things up. Writer's AIs pull data directly from customers' documents, guaranteeing far fewer hallucinations, if less creativity. But who needs poetry when your AI is generating market analysis?
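Grounding answers in a customer's own documents is, in essence, retrieval-augmented generation. The sketch below illustrates the general pattern only, not Writer's implementation: a toy keyword-overlap retriever and simple string assembly stand in for a real vector index and language model.

```python
# Toy retrieval-grounded answering: pull the most relevant passages from a
# company's own documents and answer only from them. A production system would
# use a vector index and an LLM; keyword overlap stands in here.

def score(query: str, passage: str) -> int:
    # Count how many query words appear in the passage.
    return len(set(query.lower().split()) & set(passage.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    # Return the k passages with the highest overlap score.
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def answer(query: str, docs: list[str]) -> str:
    context = retrieve(query, docs)
    # Constraining the "model" to retrieved context is what curbs hallucination.
    return f"Q: {query}\nGrounded in: {context}"

docs = [
    "Refunds are processed within 5 business days.",
    "Enterprise plans include single sign-on and audit logs.",
    "Support is available 24/7 via chat.",
]
print(answer("How long do refunds take?", docs))
```

The trade-off the article mentions follows directly: because the answer is constrained to retrieved passages, the system invents less, but it also can't stray far from what's already written down.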
In a world in which OpenAI spent $100 million to train GPT-4, Writer is doing all this relatively cheaply. Its rival model cost just $700,000 to build. That's far less even than DeepSeek, the Chinese company that upended the AI world by building a model that competes with OpenAI's at a fraction of the cost. 'DeepSeek made efficiency cool, but Writer has been doing it for years,' says Rob Toews, a Writer board member and partner at Radical Ventures, the Toronto-based venture firm that invested in the company.
Not everyone is sold. Some industry experts are unsure whether Writer's smaller models can keep up with the giants. On popular leaderboards, which rank models based on how well they answer a host of questions, Writer trails AI juggernauts like OpenAI and Anthropic, whose revenue is in the billions. OpenAI has 2 million paying users for ChatGPT Enterprise, and Anthropic reportedly projects it will grow to $34.5 billion in revenue by 2027, two-thirds of which will come from enterprise users. 'Writer's great strength has been evangelizing what they're doing. I think the challenge is making sure the technology can live up to that,' says one venture capitalist who has invested in other enterprise AI companies.
Writer trained its own models (named Palmyra after the ancient Syrian city) even though everyone advised against it. 'You don't need to play by the rules all the time,' says CTO Waseem Alshikh.
Habib is confident it can. Companies care more about real-world performance than benchmark horse races. And with a client roster that already includes blue-chip outfits like Accenture, Hilton, Spotify and Qualcomm, Writer's investors are happy to continue betting on her. 'She just can run through walls and make the impossible happen,' Toews says. Even VCs who passed on the last round speak admiringly of her. 'She's just kind of a force of nature,' says one.
Entrepreneurship runs in the family, says Habib, who grew up in a tiny village on the war-torn border of Lebanon and Syria. Her father started his own tool-and-die shop, while her mother worked at a pita bakery. The eldest of eight siblings, Habib was managing the family checkbook by age 9. Her family fled the civil war to Canada in 1990, and Habib, the only one in her family who spoke English, graduated from Harvard in 2007 with a degree in economics and a minor in Eastern languages. While working as an investment banker in Dubai, she met Alshikh, 40, a Syrian technology executive who had taught himself to speak English so he could learn to code. Alshikh's first company, which converted satellite photos into digital maps for cars, was taken over by the Syrian government because it thought he wasn't 'patriotic' enough, he says; in retaliation, he claims he hacked into the government's servers to shut down the internet in the country.
The duo launched the first iteration of Writer in 2015 as machine translation company Qordoba, before pivoting to build AI that could generate content in a company's style and tone, relaunching as Writer in 2020. Twitter became one of its first customers, using Writer's system to pump out blog posts. (After Elon Musk bought Twitter in 2022, he stopped paying; Writer sued and eventually got 95% of what it was owed. X, as Twitter is now known, did not respond to a request for comment.)
Habib jokes that Writer has evolved into a different company every four to six months to stay relevant. 'We are the Eras Tour of generative AI,' she says, nodding to Taylor Swift's career-spanning concert tour. 'We extract the rate of change for companies that can't get their heads around it on their own.'
It's fitting, then, that the next stage of Writer's business involves an entirely new approach to AI. Alshikh calls it 'self-evolving.' His team is building models that learn from mistakes automatically, without needing a human to correct them. 'It's like hiring smart people in your companies,' he says. 'You expect them over time to know more and learn more.'
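Alshikh doesn't spell out how 'self-evolving' works. One common pattern the description evokes is an automated generate-check-retry loop in which failures are fed back as guidance with no human in the middle; the toy sketch below is purely illustrative, with a trivial rule-based checker standing in for whatever Writer actually builds.

```python
# Illustration of an automated correct-from-mistakes loop: generate, check,
# and feed the failure back in as guidance, with no human reviewer involved.
# The generator and checker are trivial stand-ins, not Writer's system.

def generate(task: str, feedback: list[str]) -> str:
    draft = f"Draft for: {task}"
    if any("too long" in note for note in feedback):
        draft = draft[:20]  # apply the earlier correction
    return draft

def check(draft: str) -> str | None:
    # Return a failure note, or None if the draft passes.
    return "too long" if len(draft) > 20 else None

def run(task: str, max_rounds: int = 3) -> str:
    feedback: list[str] = []
    for _ in range(max_rounds):
        draft = generate(task, feedback)
        problem = check(draft)
        if problem is None:
            return draft
        feedback.append(problem)  # the "learning": carry the mistake forward
    return draft

print(run("write a product description"))
```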
For Habib, this means new possibilities, and she can't wait to get in the room with her customers to sketch them out together. 'Even now, the team knows there has to be a whiteboard,' she says. 'I can't co-think or co-create with a customer without trying to imagine something that doesn't exist.'
Additional reporting by Richard Nieva.