
The AI-Readiness Crisis: Why Businesses Can't Wait for Universities
More than half of recent college graduates don't know whether their colleges have prepared them for "the use of generative AI," according to a Cengage Group survey cited by Higher Ed Dive, while about 66% of employers believe potential job candidates should have "foundational knowledge" of generative AI tools.
This disconnect is complicating hiring, especially, in my experience, for recent computer science graduates.
In my role, I've spoken with dozens of bright developers who have never used GitHub Copilot, have no idea how machine-learning pipelines or observability frameworks work, and don't understand how explainability tools like SHAP or LIME describe a model's predictions (see the sketch below). Let me be clear: This reflects how quickly industry changes and how slowly academia adjusts, not these developers' abilities.
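For readers unfamiliar with those tools, here is a minimal sketch of what that kind of model "description" looks like in practice. It assumes the open-source shap package and scikit-learn are installed; the dataset and model are stand-ins chosen purely for illustration:

import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Train a small model on a bundled dataset (stand-ins for a real system).
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer computes Shapley values efficiently for tree ensembles.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X.iloc[:5])

# Each row attributes one prediction to the input features: how much each
# feature pushed the output above or below the model's baseline.
print(dict(zip(X.columns, shap_values[0].round(2))))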
Having managed international AI engineering teams, I've seen how these gaps manifest on the ground: longer onboarding times, a lack of trust in AI tools and missed opportunities to streamline development cycles. To solve this challenge, companies can't wait for universities to catch up to AI. They need to find ways to train engineers to adapt to this ever-changing technology.
According to a 2024 survey from Kyndryl, 71% of business leaders feel their workforce is not yet ready to leverage AI, with many citing the lack of "skilled talent needed to manage AI" as a major reason. Companies without an AI-savvy workforce are forced to postpone deployments, outsource critical functions or accept lower product quality.
The good news: According to McKinsey, nearly half of employees want more formal AI training. Yet other research shows that only 31% of employers are providing it. When AI talent can't be hired fast enough, it must be developed from within.
Big Tech is already working on this, with Microsoft, Google, IBM, Intel, SAP and Cisco collectively planning to train over 100 million workers.
I've seen these kinds of programs succeed at my own company, where we set up an internal AI bootcamp, launched hands-on labs tied directly to real-world projects and paired junior engineers with mentors experienced in AI.
To promote practical upskilling, we also host project-focused webinars, arrange AI-focused hackathons and assign structured learning paths on platforms like Udemy to support ongoing improvement.
Based on these experiences, here are five ways to bridge the AI skills gap at your organization:
1. Launch an internal AI learning program. Instead of relying on pre-made tutorials, create learning tracks centered on the actual issues your engineers encounter, such as using AI for CI/CD optimization, auto-generating test cases or enhancing search relevance with natural language processing. (A sketch of one such exercise follows this list.)
2. Make AI a core part of DevOps. AI is not an "optional add-on." Tools like Amazon CodeWhisperer and GitHub Copilot are quickly taking over as the standard. Integrate them into documentation procedures, deployment flows and code reviews.
3. Promote peer mentorship. While formal training has its place, one-on-one, contextual mentoring frequently works better. Create "AI champion" roles and make it easy for team members to shadow them and learn in real time.
4. Measure AI tool adoption. Track how often engineers use AI tools for backlog grooming, testing, debugging and code commits (a simple measurement sketch follows this list). Organize frequent hackathons or internal demos centered on AI-enhanced engineering.
5. Partner with academic institutions. Engage faculty at the schools you hire from most. Provide real-world problem statements, fund AI-themed student projects or collaborate on modular course materials. It strengthens both your brand and your talent pipeline.
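To make tip 1 concrete, here is a minimal sketch of one hands-on lab exercise: having an LLM draft unit tests that engineers then review. The model name, prompt and sample function are illustrative assumptions, not a prescribed setup; it assumes the openai Python package and an OPENAI_API_KEY environment variable:

from openai import OpenAI

client = OpenAI()

# A small function under test -- a hypothetical example for the exercise.
SOURCE = '''
def parse_version(v: str) -> tuple[int, int, int]:
    major, minor, patch = v.split(".")
    return int(major), int(minor), int(patch)
'''

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any capable code model your org approves
    messages=[
        {"role": "system", "content": "You write concise pytest unit tests."},
        {"role": "user", "content": f"Write pytest tests covering edge cases for:\n{SOURCE}"},
    ],
)

# The lab's real lesson: engineers review and run the generated tests before
# committing them, verifying AI output rather than blindly accepting it.
print(response.choices[0].message.content)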
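And for tip 4, a minimal sketch of one adoption signal: counting commits whose messages carry an "AI-assisted:" trailer. The trailer is a hypothetical team convention, not a feature of Copilot or any other tool; the script assumes a local git checkout:

import subprocess

def count_commits(since: str, grep: str | None = None) -> int:
    # git rev-list --count tallies commits; --grep filters on the message.
    cmd = ["git", "rev-list", "--count", f"--since={since}", "HEAD"]
    if grep:
        cmd.append(f"--grep={grep}")
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    return int(out.strip())

total = count_commits("30 days ago")
assisted = count_commits("30 days ago", grep="AI-assisted:")  # hypothetical trailer
print(f"AI-assisted commits: {assisted}/{total} ({assisted / max(total, 1):.0%})")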
The move toward AI-native development is no longer a matter of speculation. It has already arrived. Developers are now expected not just to write code but to direct and verify the output of AI. Businesses that don't facilitate this change will see higher turnover, higher training expenses and lower developer productivity.
On the other hand, companies that make AI fluency a strategic capability for all engineers, not just data scientists, will gain a compounding advantage. They will ship faster, adapt better and attract top talent who want to build for the future rather than the past.
Don't wait for higher education to fill the AI gap. Begin within your organization: Invest in mentorship, align tooling with learning and cultivate an internal culture of AI fluency. In the future of software engineering, the ability to code with AI will matter more than simply knowing how to code.