Plug and Play closes $50 million Fintech & AI Fund to drive impact through direct access to global decision makers
SUNNYVALE, Calif., June 10, 2025 /PRNewswire/ -- Plug and Play, one of the world's most active early-stage investors, announced today during its Silicon Valley June Summit 2025 the closing of its $50 million Fintech & AI Fund, backed by nine institutional investors. The fund is one of nine that Plug and Play manages in total.
The largest industry-themed fund the company has raised to date, it invests in companies globally and reflects the evolution of the innovation journey that many high-profile financial services companies have taken.
Over the years, many of Plug and Play's limited partners have actively participated in pilot and innovation programs and aided in collaborative development efforts. This participation has reinforced the partners' trust in Plug and Play and its ability to recognize, fund, and scale innovative technologies.
"Plug and Play's ecosystem has been a valuable source of innovation and market insight," said Sandeep Manchanda, Head of Insurance M&A and Partnerships at EXL, one of the investors of the fund. "With this fund, we're taking that engagement even further - partnering earlier and more strategically with the AI-driven technologies shaping the next chapter of insurance and financial services."
Plug and Play made the fund announcement during the Enterprise & AI Expo, part of the Silicon Valley June Summit 2025. The three-day event brings together more than 75 speakers and more than 200 startups to discuss and demonstrate a range of technologies at the company's Sunnyvale headquarters.
"AI is changing everything and industry startups are scaling faster than ever," said Eugenio Gonzalez, Partner at Plug and Play. "The fund supports our value proposition of accelerating sales cycles by connecting companies with the right decision makers at global corporations. It is a key part of this dynamic ecosystem that includes a roster of entrepreneurs and corporations we've developed over the years. It reflects a shift from shorter-form experimentation to long-term value creation as this fund allows us to back exceptional founders earlier and support them more meaningfully as they build the future of fintech, enterprise, and insurtech."
In addition to capital, Plug and Play provides portfolio companies with access to a global network of over 550 corporate partners across more than 25 industries. This network provides startups with opportunities, including pilot projects, customer acquisition, and revenue growth. Plug and Play brings a strong track record, with more than 300 successful exits and a global portfolio of thousands of startups.
About Plug and Play
Plug and Play is the leading innovation platform, connecting startups, corporations, venture capital firms, universities, and government agencies. Headquartered in Silicon Valley, we're present in 60+ locations across five continents. We offer corporate innovation programs and help our corporate partners in every stage of their innovation journey, from education to execution. We also organize startup programs and have built an in-house VC to drive innovation across multiple industries, where we've invested in hundreds of successful companies including Dropbox, Guardant Health, Honey, Turing, Lending Club, N26, PayPal, and Rappi. For more information, visit https://www.plugandplaytechcenter.com/.
© Plug and Play Financial Services Fund I, L.P. (legal entity of the Fintech & AI Fund)
Plug and Play Press Contact
Jacky Tsang
Senior Communications & PR Associate
press@pnptc.com
View original content to download multimedia: https://www.prnewswire.com/news-releases/plug-and-play-closes-50-million-fintech--ai-fund-to-drive-impact-through-direct-access-to-global-decision-makers-302477100.html
SOURCE Plug and Play