
Latest news with #bigData

Higher Colleges of Technology and Saal.ai forge strategic partnership to build the UAE's next generation of AI Talent

Zawya

2 days ago

The collaboration aims to prepare UAE nationals for the industry, providing them with practical experience and training in artificial intelligence and big data systems. The agreement was officially signed during a ceremony held at the Higher Colleges of Technology campus.

Abu Dhabi, UAE: In a major step toward advancing the UAE's innovation ecosystem and higher education landscape, the Higher Colleges of Technology (HCT), the largest higher education institution in the UAE, has signed a strategic partnership with Saal.ai, a cognitive technology company under the Abu Dhabi Capital Group (ADCG). The collaboration aims to equip Emirati students with future-ready skills in artificial intelligence (AI) and big data technologies.

The agreement was formalized during a signing ceremony held at the HCT campus, reinforcing both institutions' commitment to nurturing local talent and supporting the UAE's transformation into a knowledge-based economy. This initiative aligns with the UAE Year of Community and the nation's broader goals to accelerate innovation, empower youth, and foster impactful partnerships between academia and technology leaders.

At the heart of this collaboration is the integration of DigiXT, a UAE-developed enterprise-grade AI and big data platform, into HCT's academic programs. This will offer students hands-on learning experiences with real-world applications of AI and data analytics. Through this platform, students will develop practical expertise that enhances their employability, while industry partners benefit from a skilled pipeline of graduates proficient in cutting-edge technologies.

His Excellency Dr. Faisal Alayyan, President and CEO of HCT, emphasized the significance of the partnership, stating: 'This initiative marks a transformative milestone in our educational journey. By embedding AI into our curricula, we are equipping Emirati youth with the future-focused skills needed to lead in a digital economy. Our partnership with Saal.ai ensures students gain both technical competencies and the innovative mindset to become agile problem-solvers and drivers of national progress.'

Vikraman Poduval, CEO of Saal.ai, said: 'At Saal.ai, we believe in the transformative power of technology to drive societal progress. By equipping Emirati students with practical AI skills and collaborating with public sector entities, we are fostering innovation as well as contributing to the UAE's vision of a sustainable and knowledge-based economy.'

This partnership is aligned with HCT's 2023–2028 strategic vision, which prioritizes applied learning, industry engagement, and the development of future-ready graduates. It also reinforces HCT's role in delivering high-impact academic experiences that prepare Emirati students for the demands of an evolving global workforce. More broadly, the collaboration supports the UAE's national educational agenda and the UAE National Strategy for Artificial Intelligence 2031, which aims to position the country as a global AI leader. The strategy envisions integrating AI across sectors such as education, space, technology, energy, and transportation.

Further demonstrating this commitment, the UAE will introduce AI across government school curricula, from kindergarten through Grade 12, beginning next academic year. This nationwide initiative is part of a long-term strategy to prepare future generations for the digital era. Globally, the AI industry is projected to reach a market value of USD 15.7 trillion by 2030. For the UAE, AI adoption is expected to boost GDP by 35 percent (USD 96 billion) and reduce government spending by nearly USD 3 billion through greater efficiency. This partnership represents a key milestone in aligning education with national priorities and empowering the next generation of Emirati innovators and leaders.

About Higher Colleges of Technology: The Higher Colleges of Technology (HCT) is the UAE's largest applied higher education institution, committed to preparing Emirati youth for the future economy through hands-on, industry-aligned learning. Established in 1988, HCT operates campuses across the Emirates, offering a wide range of programs in engineering, business, health sciences, computer information science, applied media, and education. HCT serves as a key driver of economic growth and social impact through strategic partnerships with government, industry, and academia. Its leadership team champions institutional excellence, faculty transformation, and future-ready learning to empower graduates to lead in vital sectors and contribute to the nation's prosperity.

About Saal.ai: Saal.ai is a prominent leader in AI-cognitive solutions, helping businesses across various industries improve operational efficiency and drive innovation. With a suite of UAE-developed products and platforms—including DigiXT, Academy X, Dataprism360, and Market Hub—Saal.ai offers tailored solutions designed to drive digital transformation in sectors like defence, healthcare, oil and gas, smart cities and education. A part of the Abu Dhabi Capital Group (ADCG), Saal.ai is dedicated to harnessing the power of AI to help organisations streamline processes, enhance decision-making, and create more meaningful, compassionate futures for all.

Data products and services are playing a new role in business.

Forbes

18-06-2025

Data mechanics isn't an industry. Discussion in this space tends to gravitate around the notion of 'data science' as a more pleasing umbrella term. The term perhaps borrows half its name from computer science (the label usually put on university studies designed to qualify software application developers who want to program). Either way, there is a constant push for development in the data mechanics and management space, even though we are now well over half a century on from the arrival of the first database systems.

Although data mechanics may not be an industry, it is (in various forms) a company. Data-centric cloud platform company NetApp acquired Data Mechanics back in 2021. Data Mechanics was known as a managed platform provider for big data processing and cloud analytics, and NetApp wanted it to help capitalize on the growing interest in Apache Spark, the open source distributed processing system for big data workloads. But the story doesn't end there: NetApp has since sold off some of the acquisitions that work at this end of the data mechanics space to Flexera, which makes some sense given that NetApp is known for its storage competencies and bills itself as 'the intelligent data infrastructure company'. Interestingly, NetApp confirmed that divestitures at this level often leave a residual amount of software engineering competency (and on occasion intellectual property) within the teams it still operates, so these actions have two sides to them.

NetApp is now turning its focus to expanding its work with some major technology partners to provide data engineering resources for the burgeoning AI industry. This means it is working with Nvidia on its AI Data Platform reference design via the NetApp AIPod service to accelerate (both companies hope) enterprise adoption of agentic AI. It is also now offering NetApp AIPod Mini with Intel, a joint technology designed to streamline enterprise adoption of AI inferencing. That data-for-AI thought is fundamental.

If there's one very strong theme surfacing in data mechanics right now, it's simple to highlight. The industry says: okay, you've got data, but does your data work well for AI? As we know, AI is only as smart as what you tell it, so nobody wants garbage in, garbage out. This theme won't be going away this year, and it will be explained and clarified by organizations, foundations, evangelists and community groups spanning every sub-discipline of IT, from DevOps specialists to databases to ERP vendors and everybody in between.

Operating as an independent business unit of Hitachi, Pentaho calls it 'data fitness' for the age of AI. The company is now focusing on expanding the capabilities of its Pentaho Data Catalog for this precise use. Essentially a data operations management service, this technology helps data scientists and developers know what and where their data is. It also helps monitor, classify and control data for analytics and compliance.

"The need for strong data foundations has never been higher and customers are looking for help across a whole range of issues. They want to improve the organization of data for operations and AI. They need better visibility into the 'what and where' of data's lifecycle for quality, trust and regulations. They also want to use automation to scale management with data while also increasing time to value," said Kunju Kashalikar, product management executive at Pentaho.
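To make the classification idea concrete, here is a minimal sketch of ML-assisted dataset classification for a catalog, built with generic scikit-learn components on toy column-name data. The labels, samples and approach are illustrative assumptions for this article, not Pentaho's actual product API.

```python
# Minimal sketch: classify newly discovered datasets by their column names,
# so a catalog can tag them (e.g. PII vs. financial vs. operational).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled samples: column-name snippets mapped to a sensitivity class.
samples = [
    ("customer_email home_address phone", "PII"),
    ("ssn date_of_birth passport_no", "PII"),
    ("iban account_balance swift_code", "financial"),
    ("invoice_total tax_amount currency", "financial"),
    ("sensor_temp pump_pressure rpm", "operational"),
    ("vibration_hz duty_cycle torque", "operational"),
]
texts, labels = zip(*samples)

# Split snake_case names into word tokens so similar columns share features.
clf = make_pipeline(
    TfidfVectorizer(token_pattern=r"[A-Za-z]+"),
    LogisticRegression(max_iter=1000),
)
clf.fit(texts, labels)

# A new, undocumented dataset: token overlap (address, phone, email)
# should steer the prediction toward 'PII'.
print(clf.predict(["billing_address contact_phone email"]))
```

A real catalog would train on far richer signals (profiled values, lineage, usage patterns), but the shape of the automation is the same: classify first, then route data to the right governance controls.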
There's a sense of the industry wanting to provide back-end automations that shoulder the heavy infrastructure burdens associated with data wrangling on the data mechanic's workshop floor. Because organizations are now using a mix of datasets (some custom-curated, some licensed, some anonymized, some just plain old data), they will want to know which ones they can trust, at what level, for different use cases. Pentaho's Kashalikar suggests that those factors are what the company's platform has been aligned for. He points to its ability to offer machine learning enhancements for data classification (which can also cope with unstructured data), designed to improve the ability to automate and scale how data is managed across expanding data ecosystems. These tools also offer integration with model governance controls, which increases visibility into how and where models are accessing data, for both appropriate use and proactive governance.

The data mechanics (or data science) industry tends to use industrial factory terminology throughout its nomenclature. The idea of the data pipeline is intended to convey the 'journey' for data that starts its life in a raw and unclassified state, where it might be unstructured. The pipeline progresses through various filters that might include categorization and analytics. It might be coupled with another data pipeline in some form of join, or some of it may be threaded and channelled elsewhere. Ultimately, the data pipe reaches its endpoint, which might be an application, another data service or some form of machine-based data ingestion point. Technology vendors who lean on this term are fond of laying claim to so-called end-to-end data pipelines; the phrase is meant to convey breadth and span.

Proving that this part of the industry is far from done or static, data platform company Databricks has open sourced its core declarative extract, transform and load (ETL) framework as Apache Spark Declarative Pipelines. Databricks CTO Matei Zaharia says that Spark Declarative Pipelines tackles one of the biggest challenges in data engineering: making it easy for data engineers to build and run reliable data pipelines that scale. He said end-to-end too, obviously. Spark Declarative Pipelines provide a route to defining data pipelines for both batch (i.e. overnight) and streaming ETL workloads across any Apache Spark-supported data source. That means data sources including cloud storage, message buses, change data feeds and external systems. Zaharia calls it a 'battle-tested declarative framework' for building data pipelines, one that addresses complex pipeline authoring, manual operations overhead and siloed batch or streaming jobs.

'Declarative pipelines hide the complexity of modern data engineering under a simple, intuitive programming model. As an engineering manager, I love the fact that my engineers can focus on what matters most to the business. It's exciting to see this level of innovation now being open sourced, making it more accessible,' said Jian Zhou, senior engineering manager for Navy Federal Credit Union.
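For a flavor of what 'declarative' means here, the sketch below uses the decorator idiom of Databricks' Delta Live Tables, the technology Spark Declarative Pipelines grew out of. The table names, path and data-quality rule are illustrative assumptions, and the exact module layout in the open-source release may differ from the `dlt` package shown.

```python
# Minimal sketch of a declarative pipeline in the Delta Live Tables idiom.
# The pipeline runtime supplies the `spark` session when it runs these functions;
# this is not standalone script code.
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Raw orders ingested from cloud storage (illustrative path)")
def raw_orders():
    return spark.read.format("json").load("/landing/orders/")

@dlt.table(comment="Cleaned orders, ready for analytics")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # declarative quality rule
def clean_orders():
    # Reading raw_orders declares a dependency; the engine orders execution.
    return dlt.read("raw_orders").where(col("customer_id").isNotNull())
```

The point of the model is that engineers declare tables, dependencies and constraints; the framework resolves the execution graph and handles scheduling, retries and the batch-versus-streaming plumbing that Zaharia describes.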
A large part of the total data mechanization process is unsurprisingly focused on AI and the way we handle large language models and the data they churn. What this could mean for data mechanics is not just new toolsets, but new workflow methodologies that treat data differently. This is the view of Ken Exner, chief product officer at search and operational intelligence company Elastic.

'What IT teams need to do to prepare data for use by an LLM is focus on the retrieval and relevance problem, not the formatting problem. That's not where the real challenge lies,' said Exner. 'LLMs are already better at interpreting raw, unstructured data than any ETL or pipeline tool. The key is getting the right private data to LLMs, at the right time… and in a way that preserves context. This goes far beyond data pipelines and traditional ETL; it requires a system that can handle both structured and unstructured data, understands real-time context, respects user permissions, and enforces enterprise-grade security. It's one that makes internal data discoverable and usable – not just clean.'

For Exner, this is how organizations will successfully be able to grease the data mechanics needed to make generative AI happen: by unlocking the value of the mountains of (often siloed) private data that they already own, data that is scattered across dozens (spoiler alert: it's actually often hundreds) of enterprise software systems.

As noted here, many of the mechanics playing out in data mechanics are aligned to the popularization of what the industry now agrees to call a data product. As data becomes a more tangible 'thing' in enterprise technology, alongside servers, applications and maybe even keyboards, we can consider its use as more than just information; it has become a working component on the factory floor.
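Exner's retrieval-relevance-permissions point can be sketched in a few lines. The following is a deliberately generic illustration (toy hashed embeddings, invented documents and access groups), not Elastic's product API; a real system would use a trained embedding model and a proper search engine.

```python
# Minimal sketch of permission-aware retrieval for LLM context assembly.
import hashlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Toy embedding: hash each token into a fixed-size vector.
    A production system would use a trained embedding model instead."""
    vec = np.zeros(dim)
    for tok in text.lower().split():
        h = int(hashlib.md5(tok.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Private documents, each tagged with the groups allowed to read it.
docs = [
    {"text": "Q3 revenue forecast for the EMEA region", "acl": {"finance"}},
    {"text": "Incident postmortem for the payments outage", "acl": {"eng"}},
    {"text": "Employee onboarding checklist", "acl": {"eng", "finance", "hr"}},
]
for d in docs:
    d["vec"] = embed(d["text"])

def retrieve(query: str, user_groups: set, k: int = 2) -> list:
    """Top-k documents the user may see, ranked by cosine similarity."""
    qv = embed(query)
    allowed = [d for d in docs if d["acl"] & user_groups]  # permissions first
    allowed.sort(key=lambda d: float(qv @ d["vec"]), reverse=True)
    return [d["text"] for d in allowed[:k]]

print(retrieve("revenue forecast", user_groups={"finance"}))
```

Note the ordering: access control is enforced before ranking, so a relevant document the user cannot see never reaches the LLM's context window.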

Palantir Is Reportedly Building a US ‘Spy Machine.' How Should You Play PLTR Stock Here?

Globe and Mail

02-06-2025

Palantir (PLTR) is in focus this morning following a report that it has received a contract from the Trump administration to build what is being described in the media as a 'spy machine.' According to The New York Times, the Nasdaq-listed company is creating a database of Americans based on personal data pulled from various federal agencies, raising ethical and privacy concerns. At the time of writing, Palantir shares are up more than 100% from their year-to-date low in April.

Palantir Stock Could Benefit From 'Spy Machine' Contract

Critics cite the potential for abuse, among other risks, tied to the surveillance infrastructure that Palantir Technologies is reportedly developing for the U.S. government. However, the presumably large, long-term, and high-value contract may prove a major tailwind for Palantir stock, as it will likely boost the company's already fast-growing government business. In Q1, the big data analytics firm saw its U.S. government revenue increase by another 45% year-over-year to $373 million. In short, building the said 'spy machine' for the Trump administration suggests PLTR is deeply embedded in critical government operations, which could lead to steady revenues and improved investor confidence.

William Blair Reiterates Bearish View on PLTR Shares

William Blair analyst Louie DiPalma remains bearish on PLTR shares even though the big data analytics firm is evidently growing its ties with the U.S. government this year. On its Q1 earnings call, Palantir said its ongoing investments in technical talent and AI production use cases will result in higher expenses in 2025, which DiPalma dubbed a big red flag for investors. Why? Because the AI stock is already trading at a massive premium even to the likes of Nvidia (NVDA).

Palantir Could Tank More Than 25% From Here

Other Wall Street analysts seem to agree with DiPalma's bearish view on Palantir stock, given that the consensus rating on the Denver-headquartered firm currently sits at just 'Hold'. Analysts' mean target of about $94 on PLTR shares indicates potential downside of well over 25% from current levels.
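The downside arithmetic is easy to check. A minimal sketch follows, assuming a hypothetical current price, since the article does not quote one; the $94 target is the figure cited above.

```python
# Implied downside from the current price to a mean analyst price target.
# target=94.0 comes from the article; current_price is a hypothetical placeholder.
def implied_downside(current_price: float, target: float) -> float:
    return (target - current_price) / current_price

print(f"{implied_downside(current_price=130.0, target=94.0):.1%}")  # -27.7%
```

At any price above roughly $125, a $94 target implies more than 25% downside, which matches the article's framing.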

Has Big Brother arrived? Inside the secretive Trump effort to centralize government data on millions of Americans

The Independent

31-05-2025

The Trump administration is reportedly leaning on an Elon Musk-allied tech company to build wide-ranging data tools pooling government information on millions of Americans and immigrants alike. The campaign has raised alarms among critics that the company could be furthering Musk's DOGE effort to vacuum up and potentially weaponize – or sell – mass amounts of sensitive personal data, particularly against vulnerable groups like immigrants and political dissidents.

In March, the president signed an executive order dedicated to 'stopping waste, fraud, and abuse by eliminating information silos,' a euphemism for pooling vast stores of data on Americans under the federal government. To carry out the data effort, the administration has deepened the federal government's longstanding partnership with Palantir, a tech firm specializing in building big data applications, which was co-founded by Silicon Valley investor, GOP donor, and JD Vance mentor Peter Thiel. Since Trump took office, the administration has reportedly spent more than $113 million with Palantir through new and existing contracts, while the company is slated to begin work on a new $795 million deal with the Defense Department.

Palantir is reportedly working with the administration in the Pentagon, the Department of Homeland Security, Immigration and Customs Enforcement, and the Internal Revenue Service, according to The New York Times. Within these agencies, the firm is reportedly building tools to track the movement of migrants in real time and streamline all tax data. The company is also reportedly in talks about deploying its technology at the Social Security Administration and the Department of Education, both of which have been targets of DOGE, and which store sensitive information about Americans' identities and finances.

'We act as a data processor, not a data controller,' the company insisted in response to the Times report. 'Our software and services are used under direction from the organizations that license our products. These organizations define what can and cannot be done with their data; they control the Palantir accounts in which analysis is conducted.'

The Trump administration has reportedly pursued a variety of efforts to use big data to support its priorities, including social media surveillance of immigrants to detect alleged pro-terror views, as well as surveillance of American activists who disagree with Donald Trump's views. Earlier this month, a group of former Palantir employees warned in an open letter that the company was 'normalizing authoritarianism under the guise of a 'revolution' led by oligarchs.' 'By supporting Trump's administration, Elon Musk's DOGE initiative, and dangerous expansions of executive power, they have abandoned their responsibility and are in violation of Palantir's Code of Conduct,' the employees wrote.

Previous reporting from CNN and WIRED has described efforts at the Department of Homeland Security to build mass data tools to support tracking and surveilling undocumented immigrants, a key priority for the White House as deportations still aren't reaching the levels necessary to meet Trump's promise of rapidly removing millions of people from the country. The effort has involved merging data from outside agencies like Social Security and the IRS, according to WIRED. 'They are trying to amass a huge amount of data,' a senior DHS official told the magazine. 'It has nothing to do with finding fraud or wasteful spending … They are already cross-referencing immigration with SSA and IRS, as well as voter data.'
Since Trump took office, DOGE operatives, many of whom are unknown to the public and have not been vetted, have rapidly sought access to data at key agencies, including the Departments of Education and the Treasury, as well as the Social Security Administration, often over the objections of senior staff. The efforts have prompted scores of lawsuits against DOGE. At Social Security, the administration also moved thousands of living, mostly Latino undocumented immigrants into the agency's 'Death Master File' in an attempt to pressure them to leave the country. DOGE itself is reportedly under audit for its actions by the Government Accountability Office, a federal watchdog.

An April letter from Democrats on the House Oversight Committee warned of DOGE's 'extreme negligence and an alarmingly cavalier attitude' toward sensitive data. It claimed a whistleblower had described how 'DOGE engineers have tried to create specialized computers for themselves that simultaneously give full access to networks and databases across different agencies.' The 'whistleblower information obtained by the Committee, combined with public reporting, paints a picture of chaos at SSA [Social Security Administration] as DOGE is rapidly, haphazardly, and unlawfully working to implement changes that could disrupt Social Security payments and expose Americans' sensitive data,' the letter reads.

Why Smart Facility Management Is The Sustainability Strategy Leaders Overlook

Forbes

29-05-2025

Most corporate sustainability initiatives focus on product innovation or marketing campaigns. Yet some of the most impactful environmental gains come from an overlooked source: the very buildings where business happens. As climate concerns intensify and ESG reporting becomes mandatory in more jurisdictions, forward-thinking leaders are turning their attention to the foundations—quite literally—of their operations.

'Big data and environmental sustainability go hand in hand,' explains Michael Nichols, Executive Vice President of Enterprise Products and Solutions at R&K Solutions. 'With climate change and resource depletion becoming critical global issues, there's an urgent need for practical tools to monitor and manage our environmental impact.'

Has sustainability always factored into facility management? Certainly, but primarily through the narrow lens of cost reduction. Today's approach leverages big data to transform buildings from passive assets into dynamic contributors to corporate environmental goals. Companies implementing data-driven facility management also see benefits ranging from enhanced operational resilience to strengthened stakeholder trust. Here's how leaders can leverage their physical infrastructure to drive meaningful sustainability outcomes.

1. Treat buildings as strategic assets, not cost centers.

Before investing in flashy sustainability campaigns, examine the environmental impact of your current infrastructure. Buildings generate vast amounts of performance data that, when properly analyzed, reveal opportunities for significant efficiency improvements. Start by conducting a comprehensive energy audit and facility condition assessment to establish your baseline environmental footprint. Organizations often overlook the cumulative impact of seemingly minor infrastructure decisions. A report from the U.S. Department of Energy found that commercial buildings waste up to 30% of the energy they consume through inefficient operations. The first step toward improvement is understanding exactly how your facilities perform against industry benchmarks and identifying priority areas for intervention.

2. Use predictive analytics to prioritize high-impact improvements.

Big data can track current performance and predict future outcomes. Sophisticated facility management systems now incorporate machine learning algorithms that can forecast equipment failures, simulate energy conservation scenarios, and quantify the potential environmental impact of different improvement strategies. The ability to model outcomes before implementation allows organizations to prioritize projects with the highest sustainability return on investment, as the sketch below illustrates. For example, an analytics platform might reveal that upgrading the HVAC system in one location would reduce carbon emissions more significantly than installing solar panels at another, despite the latter being more visible as a sustainability initiative.
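As a concrete illustration of that prioritization step, here is a minimal sketch that ranks candidate upgrades by estimated emissions avoided per dollar invested. The projects and figures are invented for illustration; a real platform would derive them from building telemetry and engineering models.

```python
# Minimal sketch: rank facility upgrades by sustainability return on investment.
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    cost_usd: float
    annual_co2_reduction_t: float  # tonnes of CO2 avoided per year (estimated)

def sustainability_roi(p: Project) -> float:
    """Tonnes of CO2 avoided per year, per thousand dollars invested."""
    return p.annual_co2_reduction_t / (p.cost_usd / 1_000)

# Illustrative candidate projects with made-up cost and impact estimates.
projects = [
    Project("HVAC upgrade, site A", cost_usd=250_000, annual_co2_reduction_t=420),
    Project("Rooftop solar, site B", cost_usd=600_000, annual_co2_reduction_t=310),
    Project("LED retrofit, site C", cost_usd=80_000, annual_co2_reduction_t=95),
]

for p in sorted(projects, key=sustainability_roi, reverse=True):
    print(f"{p.name}: {sustainability_roi(p):.2f} tCO2/yr per $1k")
```

Under these made-up numbers, the HVAC upgrade outranks the more visible solar installation, exactly the kind of counterintuitive result the article describes.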
3. Align facility management with broader ESG reporting.

As ESG reporting frameworks become more standardized and scrutinized, leaders need to ensure their sustainability initiatives produce measurable, verifiable results. Infrastructure improvements offer precisely this kind of concrete data point, particularly in the environmental dimension of ESG. Consider establishing a formal connection between your facility management team and sustainability officers. This collaboration ensures that infrastructure decisions support broader ESG goals and that the environmental benefits of facility improvements are properly captured in corporate sustainability reports.

The reporting benefits extend beyond regulatory compliance. When infrastructure sustainability initiatives are properly documented, they provide compelling narratives for potential investors evaluating ESG performance and for consumers increasingly making purchasing decisions based on corporate environmental responsibility. For multinational organizations, facility management data can help standardize sustainability practices across diverse regulatory environments. While sustainability requirements vary globally, a data-driven approach to infrastructure management creates consistent internal benchmarks that often exceed minimum compliance thresholds in any jurisdiction.

4. Embrace the Infrastructure-as-a-Service revolution.

The emergence of 'smart building' technologies and Infrastructure-as-a-Service models is democratizing access to sophisticated facility management capabilities. These solutions enable organizations to implement advanced sustainability features without massive capital investments in proprietary systems. Cloud-based facility management platforms allow for continuous improvement rather than point-in-time upgrades. As sustainability standards evolve and technologies advance, these systems can adapt through regular software updates rather than unsustainable wholesale replacements. The integration of Internet of Things (IoT) sensors throughout facilities creates unprecedented visibility into resource consumption and environmental conditions. From water usage monitoring to occupancy-based lighting and climate control, these technologies automate efficiency in ways that were impossible even five years ago. These advancements particularly benefit organizations with aging infrastructure. Rather than replacing entire buildings, targeted technological upgrades can dramatically improve the sustainability profile of existing facilities. The key is identifying which improvements deliver the greatest environmental benefit relative to investment.

It's easy to think that sustainability requires massive infrastructural overhauls or cutting-edge technologies. The reality is more nuanced: meaningful environmental improvements often come from better management of existing assets, informed by better data. By embracing this perspective, business leaders can transform their facilities from environmental liabilities into powerful drivers of their sustainability strategy and discover that what's good for the planet is also good for long-term business value.
