Latest news with #GalNaor


Techday NZ
5 days ago
- Business
AI Appreciation Day spotlights responsible & purposeful adoption
Artificial Intelligence Appreciation Day is prompting industry leaders to reflect on the rapid progress of AI, its real-world impact, and the challenges that accompany widespread adoption across various sectors. As businesses and governments integrate AI into operations, the conversation has shifted from novelty to necessity, with a focus on strategic, responsible use and measurable outcomes.

Within enterprise technology, the role of AI is evolving from an automation tool to the backbone of digital transformation. Gal Naor, CEO of StorONE, highlights AI's transformative influence in data storage, noting, "AI-powered auto tiering... observes how data is used and moves it between flash and disk tiers based on actual workload behaviour. This ensures frequently accessed data remains on high-performance storage, while infrequently used data is shifted to lower-cost media without affecting application performance." Naor emphasises that this capability both simplifies operations and prepares organisations for ever-increasing data demands (a simple tiering policy of this kind is sketched after this article).

In the traditionally manual construction sector, Shanthi Rajan, CEO of Linarc, points to AI as a catalyst for addressing systemic industry challenges. "AI does not replace construction professionals; it empowers them," she explains, citing improved decision-making, reduced friction, and the introduction of contextual awareness to complex projects. According to Rajan, AI brings "cohesion to complexity, accountability to action, and momentum to teams," making construction smarter and more human-centred.

Manufacturing and logistics are also benefiting from AI at the edge. "Edge AI is playing a massive role in enabling autonomous systems to make independent, real-time decisions with minimal human intervention," observes Yoram Novick, CEO of Zadara. "From self-driving cars navigating complex environments to smart factories optimising production processes, Edge AI is now delivering localised intelligence that operates well even where network connectivity is limited." Such autonomy reduces reliance on cloud connectivity and improves operational efficiency across various industries.

For Australian businesses, Carla Ramchand, CEO of Avanade Australia, describes a surge of AI investment in the mid-market. "Our latest research shows that 86% of Australian mid-market leaders are increasing their investment in AI, with most expecting a fourfold return on investment in the next 12 months." Ramchand highlights the rise of agentic AI, where systems act independently, stressing that success "depends on modern infrastructure, clean data, trusted governance, and human oversight."

Data remains central to AI's promise, as noted by Oded Nagel, CEO of CTERA. "In a ready state, data becomes the fuel for AI systems, enhancing their ability to produce actionable insights and drive strategic decisions. Companies must prioritise having their data organised and accessible, as it is the key to unlocking AI's transformative potential."

Security is another major concern, as cyber threats continue to grow in sophistication. Jimmy Mesta, CTO of RAD Security, points out, "AI is now actually the only way teams can keep up... AI can spot patterns, connect events across multiple parts of the security stack, and take action fast enough to matter." Drew Bongiovanni, Technical Marketing Manager at Index Engines, adds that "the real ROI of AI shows up after the breach. It's in the speed of recovery, the confidence in your backups, and the ability to make decisions under pressure without second-guessing your data."

AI is also reshaping enterprise application development, though with important caveats. Vijay Prasanna Pullur, CEO of WaveMaker, cautions, "Injecting AI into the design-to-code-to-deploy process without oversight or curation may not work... Enterprise applications and solutions are complex and need a lot more enablement on top of existing AI orchestration."

With growing reliance on AI comes a call for responsible governance. Josh Mason, CTO of RecordPoint, argues that businesses must "make sure [they're] governing [their] data and using the technology responsibly and ethically, in a way that benefits customers and employees." According to Mason, poor data governance is a key blocker to large-scale AI deployment, with only a minority of companies succeeding beyond pilot implementations.

Sustainability and infrastructure are increasingly seen as critical to AI's continued growth. Ted Oade of Spectra Logic urges the industry to "champion responsible development: transparency, bias mitigation, and environmental impact. Appreciating AI means understanding its full context - technical, operational, and ethical." Mark Klarzynski, CEO of PEAK:AIO, concurs, arguing that "the need for AI-native infrastructure is no longer optional. It is strategic."

As AI systems become more agentic, autonomy and insight are set to become defining characteristics. Helen Masters, Managing Director at Smartsheet, sums up the current landscape: "Today's conversation focuses on how effectively we can integrate AI into our everyday lives. Across Australia, businesses are rapidly adopting AI not as a standalone solution, but as a strategic enabler." David Hunter, CEO of Local Falcon, offers a final reflection: "AI's true power isn't in integrating it into existing tools for writing fluffy content faster... but in uncovering patterns, trends, and other insights that would otherwise go unnoticed. The future isn't 'AI everywhere.' It's AI with purpose."

As AI Appreciation Day is marked, industry consensus is clear: intelligent, responsible, and sustainable integration of AI will shape the future across every sector, provided organisations invest in governance, infrastructure, and purposeful deployment.
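Naor's auto-tiering description amounts to a simple feedback loop: observe per-extent access activity over a window, promote busy extents to flash, and demote idle ones to disk. The sketch below illustrates that general pattern only; the tier names, thresholds, and migrate() hook are assumptions for illustration, not StorONE's implementation.

```python
# Illustrative sketch of access-frequency-based auto tiering.
# Tier names, thresholds, and the migrate() hook are hypothetical.
from collections import defaultdict

HOT_THRESHOLD = 100   # accesses per window that mark an extent "hot"
COLD_THRESHOLD = 5    # activity at or below this demotes an extent to disk

class AutoTierer:
    def __init__(self):
        self.access_counts = defaultdict(int)   # extent_id -> accesses this window
        self.placement = {}                      # extent_id -> "flash" | "disk"

    def record_access(self, extent_id: str) -> None:
        """Called on every read/write so the policy sees actual workload behaviour."""
        self.access_counts[extent_id] += 1

    def rebalance(self) -> None:
        """Run once per window: promote hot extents to flash, demote cold ones to disk."""
        for extent_id, count in self.access_counts.items():
            tier = self.placement.get(extent_id, "disk")
            if count >= HOT_THRESHOLD and tier != "flash":
                self.migrate(extent_id, "flash")
            elif count <= COLD_THRESHOLD and tier != "disk":
                self.migrate(extent_id, "disk")
        self.access_counts.clear()  # start a fresh observation window

    def migrate(self, extent_id: str, target_tier: str) -> None:
        # Placeholder for the actual data movement between media.
        self.placement[extent_id] = target_tier

if __name__ == "__main__":
    tierer = AutoTierer()
    for _ in range(150):
        tierer.record_access("extent-42")   # simulate a frequently accessed extent
    tierer.record_access("extent-7")        # and a rarely accessed one
    tierer.rebalance()
    print(tierer.placement)                 # {'extent-42': 'flash'}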


Techday NZ
20-06-2025
- Business
StorONE launches ONEai for on-premises AI training & analysis
StorONE has introduced ONEai, an enterprise-focused artificial intelligence solution designed to enable large language model (LLM) training and inferencing directly within the storage layer. Developed in partnership with Phison Electronics, ONEai integrates Phison's aiDAPTIV+ technology directly into StorONE's storage platform. This arrangement means companies can carry out AI-related operations, including domain-specific model training and inferencing, on data stored locally, without the need for external AI infrastructure or cloud services.

Technology integration
ONEai offers an enterprise storage system with embedded AI processing capabilities. By leveraging GPU and memory optimisation, intelligent data placement, and direct support for LLM fine-tuning, the new solution aims to streamline AI deployment and improve access to proprietary data analytics. The system is positioned as a turnkey, plug-and-play deployment that does not require significant internal AI expertise or complex infrastructure.

The solution's architecture is focused on reducing hardware costs, improving GPU performance, and offering on-premises LLM training and inferencing. This is intended to support organisations looking to gain deeper insights from their stored data while controlling costs and maintaining data sovereignty.

Addressing enterprise challenges
With more enterprises seeking to leverage AI on multi-terabyte and petabyte-scale data pools, the traditional requirement for separate, often complex AI infrastructure has been a significant barrier. Conventional methods often depend on external orchestration and cloud or hybrid AI workflows, which can increase both regulatory risks and total costs for data-driven organisations.

StorONE and Phison have developed ONEai to deliver fully automated, AI-native LLM training and inference directly within the storage layer itself. The product supports real-time insights into file creation, modification, and deletion, and is optimised for fine-tuning, retrieval-augmented generation (RAG), and inferencing tasks (a generic sketch of this change-driven pattern follows this article). The system includes integrated GPU memory extensions, offering a user interface aimed at simplifying ongoing management.

End-to-end automation
"ONEai sets a new benchmark for an increasingly AI-integrated industry, where storage is the launchpad to take data from a static component to a dynamic application," said Gal Naor, CEO of StorONE. "Through this technology partnership with Phison, we are filling the gap between traditional storage and AI infrastructure by delivering a turnkey, automated solution that simplifies AI data insights for organizations with limited budgets or expertise. We're lowering the barrier to entry to enable enterprises of all sizes to tap into AI-driven intelligence without the requirement of building large-scale AI environments or sending data to the cloud."

ONEai is presented as a system that automatically recognises and responds to changes in stored data, supporting immediate, ongoing AI analysis. Its plug-and-play approach is designed to remove the requirement for separate AI platforms and deliver full on-premises processing to ensure maximum data control.

Phison's technical contribution
"We're proud to partner with StorONE to enable a first-of-its-kind solution that addresses challenges in access to expanded GPU memory, high-performance inferencing and larger capacity LLM training without the need for external infrastructure," said Michael Wu, GM and President of Phison US. "Through the aiDAPTIV+ integration, ONEai connects the storage engine and the AI acceleration layer, ensuring optimal data flow, intelligent workload orchestration and highly efficient GPU utilization. The result is an alternative to the DIY approach for IT and infrastructure teams, who can now opt for a pre-integrated, seamless, secure and efficient AI deployment within the enterprise infrastructure."

According to the companies, ONEai's plug-and-play deployment model can eliminate the requirement for in-house AI expertise while streamlining overall operations. The integrated GPU modules inside the storage layer aim to lower AI inference latency and deliver up to 95% hardware utilisation, while also minimising power consumption and operational costs by reducing the number of GPUs required.

Use cases and availability
ONEai is designed for immediate interaction with proprietary data, automatically tracking data changes and feeding updates into ongoing AI training and inferencing processes. This is intended to align with real-world enterprise needs for rapid, domain-specific data analysis. The solution will become generally available in the third quarter of 2025.


Techday NZ
25-04-2025
- Business
StorONE and Phison expand partnership, launch AI storage platform
StorONE has announced an expansion of its strategic partnership with Phison Electronics to bring an AI-native intelligent on-premises storage platform to market, available in the second quarter of 2025. The collaboration pairs StorONE's storage platform with Phison's aiDAPTIV+ memory extension technology to address the requirements of large language model (LLM) training and to allow natural-language access to storage management for enterprise and research users.

The enhanced partnership builds on previous joint efforts in providing high-performance and high-density storage capacities. The two companies now shift focus to delivering AI-native, on-premises infrastructure designed for secure LLM training and accelerated storage management with conversational capabilities. According to the announcement, the forthcoming platform aims to support scalable LLM training and AI embedding acceleration. This is intended to help enterprises and research organisations manage data privacy, optimise workflows, and improve inferencing latency using a combination of high-throughput storage and AI-driven interfaces.

StorONE's Chief Executive Officer, Gal Naor, said: "At StorONE, we have always believed in making a storage platform not just fast and affordable, but intelligent. By combining our enterprise storage platform with Phison's aiDAPTIV+, we are enabling AI infrastructure that's not just smarter, but more accessible."

Phison's aiDAPTIV+ leverages flash-based SSD memory to extend GPU VRAM capacity, allowing organisations to train large AI models with billions of parameters without relying solely on extensive GPU hardware deployments in data centres. Phison states that this approach enables more cost-effective LLM training by overcoming GPU memory limitations while maintaining data locality and security.

Michael Wu, President of Phison US, said: "Phison is committed to democratizing AI performance - and our partnership with StorONE expands that vision at scale. Together, we're giving customers a powerful way to accelerate their AI ambitions with secure, intelligent on-premises infrastructure."

The storage platform will also include an AI-powered chatbot interface, designed to enable administrators and developers to interact with the system using natural-language queries. This feature aims to simplify management tasks, such as querying system status, diagnosing performance bottlenecks, and optimising resource workflows (a minimal sketch of such an interface follows this article).

The companies stated that the joint platform is intended to be available from the second quarter of 2025, targeting enterprise and research customers requiring on-premises AI training capabilities and natural language-accessible storage control.

Phison Electronics has a background in NAND flash controller and storage technology, with a portfolio including the aiDAPTIV+ solution and enterprise solid-state drives built for data-intensive workloads, AI applications, and cloud environments. StorONE develops a software-based enterprise storage platform compatible with various disk types and protocols, available both on-premises and in the cloud, with an emphasis on flexibility and future support for new technologies.
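The announcement describes a chatbot that maps natural-language administrator questions to storage-management actions such as status checks and bottleneck diagnosis. A minimal sketch of that routing idea, assuming hypothetical helper functions and simple keyword matching in place of an LLM, might look like this:

```python
# Minimal sketch of routing natural-language admin queries to management calls.
# The keywords and the get_system_status()/find_bottlenecks() helpers are
# hypothetical placeholders, not the announced product's API.

def get_system_status() -> str:
    return "All pools healthy, 62% capacity used."   # placeholder data

def find_bottlenecks() -> str:
    return "Pool 'fast01' is at 95% of its IOPS ceiling."   # placeholder data

def answer(query: str) -> str:
    q = query.lower()
    if "status" in q or "health" in q:
        return get_system_status()
    if "slow" in q or "bottleneck" in q or "latency" in q:
        return find_bottlenecks()
    return "I can't map that question to a management action yet."

print(answer("What is the current system status?"))
print(answer("Why is the database volume slow?"))
```

In practice a language model, rather than keyword rules, would choose which management call to invoke and would summarise the returned telemetry for the user.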