Latest news with #MohamedbinZayedUniversityforArtificialIntelligence


Arabian Post
2 days ago
- Science
- Arabian Post
Robots Learn Delicate Touch with Tactile Skills AI
Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) has unveiled a pioneering embodied‑AI framework called Tactile Skills, enabling robots to master intricate physical tasks with human‑level precision. Spearheaded by Sami Haddadin, MBZUAI's vice‑president for research, and published in Nature Machine Intelligence on 23 June 2025, the approach promises to bridge a long‑standing divide between human dexterity and robotic automation.

The system leverages a structured curriculum inspired by vocational training and neurobiology. Expert‑defined process taxonomies guide robots through tactile subtasks, such as connector alignment and material handling, streamlining learning and reducing dependence on trial‑and‑error methods. In trials, robots achieved near‑100 per cent success across 28 industrial tasks, including plug insertion and precision cutting, even when conditions varied unexpectedly. Haddadin emphasised the breakthrough: the framework 'bridges the gap between human expertise and robotic capability… reliably mastering intricate tasks with precision and adaptability'.

Unlike conventional machine‑learning methods, Tactile Skills combines expert knowledge with reusable haptic control modules, reducing energy consumption and set‑up time while achieving industrial‑grade speed and accuracy. Crucially, the architecture appears to democratise automation: operators without extensive robotics training deployed the system effectively, signalling a shift towards accessible, flexible automation across sectors. In one demonstration, robots assembled a complex bottle‑filling device, underscoring its real‑world relevance.

The emergence of Tactile Skills arrives amid broader momentum in physical AI, where robots are evolving beyond pre‑programmed sequences to exhibit embodied intelligence. Google DeepMind recently released an on‑device version of its Gemini Robotics model, enabling vision‑language‑action capabilities offline and requiring only 50–100 demonstrations to learn new tasks. This aligns with physical‑AI trends prioritising simulation‑to‑real transfer, vision–action integration and multisensory perception. Parallel advances include MIT's simulation‑powered system, which enables robots to infer an object's weight and softness through handling alone, and Amazon's Vulcan, a sensor‑enhanced warehouse robot with tactile grasping capabilities that can manage a broader range of objects in logistics environments.

Within this context, Tactile Skills stands out by combining theoretical rigour, hands‑on taxonomies and near‑perfect success rates. The framework eschews massive datasets and generic deep learning, instead embedding human expertise directly into the robot's curriculum, emulating the way mastery is acquired in skilled trades.

Looking ahead, the implications span manufacturing, healthcare, logistics and home automation. The ability to train robots rapidly on delicate physical tasks opens doors to automating activities previously deemed too nuanced for machines. Moreover, lowering technical barriers empowers smaller firms and facilities to deploy adaptable robotics at scale.

Nonetheless, challenges remain. Real‑world deployment demands robust hardware, reliable sensor systems and fail‑safe protocols. Ethical considerations also surface: workforce displacement, quality control and safety monitoring require balanced oversight. Integrating tactile precision with existing robotics infrastructure may involve standardising interfaces and establishing trustworthy deployment guidelines.
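To make the idea of an expert‑defined process taxonomy concrete, the following is a minimal, hypothetical sketch in Python. Everything here is invented for illustration, including the skill names and parameters; it shows only the general shape of a taxonomy mapping process classes to ordered tactile subtasks, not the paper's actual implementation.

```python
# Illustrative only: a minimal sketch of how an expert-defined process
# taxonomy might be encoded as a reusable skill curriculum. All class,
# skill and parameter names are hypothetical, not the paper's.
from dataclasses import dataclass, field


@dataclass
class TactileSkill:
    """A reusable tactile control primitive with tunable parameters."""
    name: str
    params: dict = field(default_factory=dict)

    def execute(self) -> bool:
        # Placeholder: a real primitive would run a force/impedance
        # controller and report success from sensor feedback.
        print(f"executing {self.name} with {self.params}")
        return True


# Expert-defined taxonomy: each process class maps to an ordered
# sequence of tactile subtasks (the robot's "curriculum").
TAXONOMY = {
    "plug_insertion": [
        TactileSkill("approach", {"speed_mm_s": 50}),
        TactileSkill("align_connector", {"search_radius_mm": 2.0}),
        TactileSkill("compliant_insert", {"max_force_n": 15.0}),
    ],
    "precision_cutting": [
        TactileSkill("position_blade", {"speed_mm_s": 20}),
        TactileSkill("contact_cut", {"feed_rate_mm_s": 5.0}),
    ],
}


def run_process(process: str) -> bool:
    """Execute every subtask of a process class in curriculum order."""
    return all(skill.execute() for skill in TAXONOMY[process])


if __name__ == "__main__":
    run_process("plug_insertion")
```

The appeal of such a structure, as the article describes it, is that each primitive is reusable across processes, which is what would let operators compose new tasks without retraining a model from scratch.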
Academic and industry experts note that the next phase will involve generalising tactile curricula beyond the initial tasks. Emerging tactile‑language‑action models show early promise in translating language instructions into fine‑grained physical actions, crucial for open‑ended applications. Meanwhile, meta‑learning techniques are enabling robots to 'learn to learn' from minimal data, suggesting even greater flexibility ahead. As embodied intelligence matures, Tactile Skills signals a shift: robots will no longer rely solely on data scale, but on structured skill pedagogy. If education‑inspired frameworks are replicated across platforms, robotics could finally conquer the delicate, dexterous domains that have thwarted automation, transforming industries and daily life alike.


TECHx
3 days ago
- Science
- TECHx
MBZUAI Reveals ‘Tactile Skills' Breakthrough in Robotics
Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) has announced a major breakthrough in robotic automation. The research introduces a new embodied‑AI framework called Tactile Skills.

Despite rapid progress in robotics, machines have long struggled with delicate, tactile tasks, such as inserting connectors and handling flexible materials. However, the Tactile Skills framework is now changing this landscape.

Sami Haddadin, VP for Research at MBZUAI, led the project. He collaborated with his former PhD student Lars Johannsmeier, Yanan Li from the University of Sussex, and Etienne Burdet from Imperial College London. Their work was published on June 23 in Nature Machine Intelligence.

The researchers revealed that Tactile Skills offers a scalable, practical, and theoretically sound solution. It is inspired by the human neural system and vocational training methods. The framework uses a structured taxonomy based on expert-defined process specifications. In simple terms, it acts like a specialised curriculum for robots, enabling them to quickly learn and master new physical tasks with precision and adaptability.

According to Haddadin, the new framework bridges the gap between human expertise and robotic capability. He reported that robots can now master intricate tasks reliably, a significant step forward in practical automation.

The system was tested across 28 industrial tasks, including demanding activities such as plug insertion and precision cutting. Results showed nearly 100% success, even when objects were misaligned or conditions changed:
- Robots adapted to changes with minimal error
- Performance remained fast and industrial-grade

Unlike traditional machine-learning models, Tactile Skills does not rely on trial-and-error or massive datasets. Instead, it combines expert process knowledge with reusable tactile control components, which drastically speeds up learning and cuts energy use.

One key highlight was its success in assembling a complex bottle-filling device. The researchers said this proves the framework's value for real-world manufacturing. They also noted that operators with limited robotics knowledge can deploy these systems efficiently. As a result, setup time and costs are significantly reduced.

Haddadin stated that this advancement could help transform robots into adaptable, skilled assistants. He added that industries now have a viable path toward automating complex, tactile tasks. Ultimately, the team believes this innovation unlocks new possibilities for automation. They reported that Tactile Skills makes reliable robotic capabilities accessible across industries and even in homes.
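As a rough illustration of what a reusable tactile control component might involve, the toy one‑dimensional sketch below runs a simple impedance controller against a spring‑like contact surface and reports success once a target contact force is sensed. The gains, contact model and thresholds are all invented for demonstration; this is not the paper's actual controller.

```python
# Illustrative only: a toy 1-D sketch of a reusable tactile control
# component of the kind the article describes. Every constant here is
# invented for demonstration purposes.

K = 500.0            # virtual stiffness of the controller (N/m)
D = 45.0             # virtual damping (N*s/m)
MASS = 1.0           # effective end-effector mass (kg)
DT = 0.001           # control period (s)
WALL = 0.050         # contact surface position (m)
K_ENV = 20000.0      # environment stiffness (N/m)
TARGET_FORCE = 10.0  # desired contact force for "inserted" (N)


def environment_force(x: float) -> float:
    """Spring-like reaction force once the tool penetrates the surface."""
    return -K_ENV * (x - WALL) if x > WALL else 0.0


def compliant_insert(x_goal: float, steps: int = 5000) -> bool:
    """Drive toward x_goal with an impedance law; stop on firm contact."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        f_ext = environment_force(x)
        # Impedance control: behave like a spring-damper toward the goal.
        f_cmd = K * (x_goal - x) - D * v
        a = (f_cmd + f_ext) / MASS
        v += a * DT  # semi-implicit Euler integration
        x += v * DT
        if -f_ext >= TARGET_FORCE:  # sensed contact force reached target
            return True
    return False


if __name__ == "__main__":
    # Command a goal slightly beyond the surface so the tool presses in.
    print("insertion succeeded:", compliant_insert(x_goal=0.075))
```

A primitive of this kind succeeds without knowing the exact surface position in advance, which echoes the article's point about robots tolerating misalignment and changing conditions.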