
Latest news with #real-time

Bitcoin's Mysterious Creator Is (Almost) the World's 10th Richest Person

Yahoo

14-07-2025

  • Business
  • Yahoo


Bitcoin's pseudonymous creator, Satoshi Nakamoto, now ranks among the wealthiest individuals (or groups of individuals) on the planet, without ever moving a single dollar of the fortune or revealing any identifying details. With BTC climbing above $122,000 on Monday, Satoshi's estimated 1.1 million coins are worth over $134 billion, according to public blockchain data. That would place them just outside the global top 10 richest people, ahead of names like Dell Technologies CEO Michael Dell and Walmart heir Rob Walton, and inching closer to former Microsoft CEO Steve Ballmer and legendary investor Warren Buffett on the Forbes real-time billionaires list. The stash is within touching distance of Google co-founder Sergey Brin, who has an estimated net worth of $142 billion.

Satoshi's wallet, which accumulated all its holdings by mining the network in its earliest days, when it could still be run on a few laptops, has remained untouched since 2010. None of the BTC has ever moved, sparking endless speculation about whether Nakamoto is dead, missing or simply committed to never interfering with the project again.

Unlike most billionaires, Satoshi didn't build a company, pitch to VCs or list anything on the stock market. Yet 16 years later, that quiet launch has helped spawn a network worth $2.4 trillion at current valuations. Bitcoin notched a new all-time high this week, buoyed by renewed ETF inflows, inflation-hedge narratives and persistent demand from institutions. While Satoshi's fortune is theoretical, as none of it has been sold or verified as accessible, the valuation highlights just how far crypto has come since Satoshi's final forum post in 2011.
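The headline figure is straightforward arithmetic on public chain data; a quick sketch using the article's own estimates (the coin count and price level are the reported estimates, not verified values):

```python
# Back-of-the-envelope valuation of Satoshi's estimated holdings.
# Both inputs are the article's estimates: ~1.1M BTC, BTC above $122,000.
satoshi_btc = 1_100_000    # coins estimated mined in the network's earliest days
btc_price_usd = 122_000    # Monday's price level cited in the article

fortune_usd = satoshi_btc * btc_price_usd
print(f"Estimated fortune: ${fortune_usd / 1e9:.0f} billion")  # ~$134 billion
```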

Santos to deploy Xecta's IPSM solution at Eastern Australia and PNG assets

Yahoo

11-07-2025

  • Business
  • Yahoo


Energy company Santos has agreed to deploy Xecta's integrated production system model (IPSM) across its assets in Eastern Australia and Papua New Guinea (PNG). This digital oilfield project expansion marks a significant step in the application of real-time production optimisation technology.

The five-year agreement builds on the success of the IPSM's initial deployment in the Cooper Basin, which has already resulted in a measurable increase in production. The project will now extend to Santos' Coal Seam Gas and PNG operations.

Developed by Xecta, the IPSM is claimed to be an industry-first solution that provides comprehensive insights for production optimisation. By analysing billions of telemetry data points and automating engineering workflows, the platform delivers insights that traditional tools and manual processes cannot replicate. The Cooper Basin deployment covered more than 1,000 wells and ten satellite facilities. In one of Australia's most challenging operational environments, the IPSM has enabled Santos to modernise surveillance and optimisation workflows, leading to significant reductions in the time and effort required of engineering teams.

Xecta CEO Sanjay Paranji said: 'IPSM represents a fundamental shift in how production systems are understood and optimised. By combining domain physics with AI, IPSM enables a continuously calibrated view of field performance at scale—automating surveillance, surfacing optimisation opportunities, and dramatically reducing manual effort in even the most complex environments.'

In a separate development, Santos has secured a mid-term contract with QatarEnergy Trading to supply approximately 500,000tpa of LNG for two years, commencing in 2026. The LNG for this agreement will be sourced from Santos' portfolio of assets and delivered on an ex-ship basis.
"Santos to deploy Xecta's IPSM solution at Eastern Australia and PNG assets" was originally created and published by Offshore Technology, a GlobalData owned brand. The information on this site has been included in good faith for general informational purposes only. It is not intended to amount to advice on which you should rely, and we give no representation, warranty or guarantee, whether express or implied as to its accuracy or completeness. You must obtain professional or specialist advice before taking, or refraining from, any action on the basis of the content on our site.

Deutsche Bank goes live with Swift Instant Cash Reporting

Finextra

10-07-2025

  • Business
  • Finextra


Deutsche Bank has gone live with Swift's new API-driven Instant Cash Reporting (ICR) tool for accessing real-time account and balance information through a single, standardised connection.

As part of Deutsche Bank's expanding API capabilities, ICR delivers immediate, on-demand financial data access to corporate clients through the Swift infrastructure. Spain-based energy firm Iberdrola is the first client to implement ICR in its treasury.

Through ICR, the bank's clients can collect real-time account and balance data via a single access point using the ISO 20022 data model and the secure JSON format. Swift acts as the central connector, routing API pull requests from corporates to Deutsche Bank. Deutsche Bank responds with standardised account data in JSON format, tailored to the corporate's selected accounts or full account set, based on the associated Swift BIC.

Johnny Grimes, head of corporate cash product at Deutsche Bank, says: 'ICR addresses the key demand of corporates for multi-bank solutions and consistent standards in the API space to simplify adoption.'

ICR is currently accessible to Swift-connected corporates and financial institutions. Deutsche Bank says it welcomes other banks and corporates joining the initiative to support scaling up the usage of multi-bank API solutions.
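The pull-based pattern described above can be sketched in a few lines. Note that the field names, report code and BIC below are illustrative stand-ins only, loosely echoing the ISO 20022 cash-management vocabulary; they are not Swift's actual ICR schema:

```python
import json

# Illustrative sketch of a corporate treasury "pull" request that would be
# routed via Swift to the account-servicing bank. All names are invented.
def build_balance_request(requestor_bic: str, account_ids: list) -> str:
    """Serialize a balance-report request; an empty account list means
    the full account set associated with the requestor's BIC."""
    return json.dumps({
        "requestor": requestor_bic,       # hypothetical field name
        "accounts": account_ids or "ALL", # hypothetical convention
        "reportType": "ITBD",             # hypothetical report code
    })

req = build_balance_request("IBERDROLA-BIC", ["ACCT-0001"])
print(req)
```

The bank's side would answer with standardised account data in JSON, keyed to the selected accounts, which is the loop the article describes.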

Building The Backbone Of Intelligent Automation

Forbes

07-07-2025

  • Business
  • Forbes


Shinoy Vengaramkode Bhaskaran, Senior Big Data Engineering Manager, Zoom Communications Inc.

As AI agents become more intelligent, autonomous and pervasive across industries, from predictive customer support to automated infrastructure management, their performance hinges on a single foundational capability: real-time, context-rich data. These agents aren't traditional analytics consumers; they're dynamic systems that require timely, reliable and actionable data streams to sense, reason and respond. What powers this intelligence isn't just the model but the pipeline behind it.

The AI Agent Era And Its Data Dependency

AI agents today operate in environments that demand not only accuracy but also speed and adaptability. From autonomous fraud detection in fintech to personalized content delivery in media, these agents must process data from diverse sources and react to constantly evolving inputs. Traditional extract, transform and load (ETL) systems, built for periodic batch jobs, fall short in these contexts. Instead, modern pipelines need to be event-driven, modular and responsive to real-time stimuli. And although the tools used may vary, the architectural principles remain largely consistent.

Redefining Data Pipelines: From Static Flow To Intelligent Feedback

Modern data pipelines for AI agents aren't just conduits for raw data; they're dynamic systems that enable feature extraction, enrichment, inference and feedback loops. These pipelines must evolve beyond rigid batch jobs toward flexible, scalable flows that support high-volume and low-latency requirements. Frameworks such as Apache Spark and Apache Flink can be considered for processing, depending on whether the task is batch-heavy or stream-driven. Spark, known for its in-memory execution, is often explored for machine learning pipelines, while Flink's fine-grained, event-time processing is suited for real-time, low-latency use cases.
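The batch-versus-stream distinction doesn't require any particular framework to see; a minimal pure-Python sketch of the two modes (Spark and Flink industrialize exactly these patterns at scale):

```python
from typing import Iterable, Iterator

# Batch style: the whole dataset is available before processing starts,
# and a single result is produced at the end of the job.
def batch_average(readings: list) -> float:
    return sum(readings) / len(readings)

# Stream style: each event updates state as it arrives, so a current
# result exists after every event rather than only at the end.
def streaming_average(readings: Iterable) -> Iterator:
    total, count = 0.0, 0
    for value in readings:
        total += value
        count += 1
        yield total / count  # running result after each event

data = [10.0, 20.0, 30.0]
print(batch_average(data))            # one answer, once all data is seen
print(list(streaming_average(data)))  # an answer after every event
```

Which mode an agent needs depends on how quickly it must act on each event, which is the latency question the article returns to below.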
However, these aren't prescriptive choices; organizations may adopt other solutions based on operational constraints, data gravity or ecosystem compatibility.

A High-Level Reference Architecture For AI-Ready Data Pipelines

Although the specific implementation stack will vary, a high-level design often includes the following components, with several tooling options available at each stage:

Ingestion: Organizations may opt for technologies like Apache Kafka or Amazon Kinesis to manage high-throughput event streams. These platforms are commonly used to decouple producers and consumers and allow for scalable, fault-tolerant ingestion, but they are just one class of tools among several available for event-driven architectures.

Transformation: For distributed data transformation, tools such as Apache Spark (in batch mode) or Apache Flink (for streaming scenarios) are frequently evaluated. Both integrate well with modern data ecosystems and support flexible processing patterns. However, alternatives such as AWS Glue, Dataflow or even lightweight serverless functions may also be considered based on scale and latency requirements.

Stream processing: Event-driven use cases, especially those requiring per-event decisions like anomaly detection or real-time personalization, might benefit from stream processors. Flink, Kafka Streams or similar tools are often mentioned in this context, but the best choice depends on the complexity of state management, processing guarantees and integration needs.

Model serving: For deploying and invoking machine learning (ML) models, services such as AWS SageMaker, Kubernetes-based ML serving frameworks or even custom containerized APIs are options many enterprises explore. These systems often feed predictions back into the pipeline for real-time adaptation, closing the loop for continuous learning.

Storage: Long-term and operational storage may leverage services like Amazon S3, HDFS or hybrid lakehouse architectures.
These enable both analytics and machine learning workloads, but selection is typically based on governance needs, query latency and cost profiles.

Orchestration and observability: Workflow engines such as Apache Airflow, Argo or AWS Step Functions are often evaluated for pipeline coordination. For observability, teams may use Prometheus, CloudWatch or Grafana, although the tooling depends heavily on existing infrastructure and compliance requirements.

Key Design Considerations

Rather than fixating on specific tools, architecture teams should focus on four design principles:

1. Latency Tolerance: Choose stream versus batch processing based on how fast the agent must act. Low-latency use cases (e.g., trading and fraud detection) will shape your stack differently than hourly insights.

2. Scalability And Resilience: Platforms like Kubernetes or cloud-native autoscaling services provide elasticity but require thoughtful cost modeling and failover planning.

3. Modularity: AI pipelines should be loosely coupled to allow parts, such as model serving or feature engineering, to evolve independently.

4. Security And Compliance: Role-based access control, encryption in motion and at rest, and audit logs should be integral from day one, regardless of stack.

MLOps And Agent Evolution

AI agents don't just infer; they evolve. To support this, pipelines must accommodate continuous training, automated deployments and model monitoring. Tools such as SageMaker Pipelines, Kubeflow or Spark MLlib can help with MLOps integration, but again, these are options, not universal solutions. The more autonomous the agent, the more critical the infrastructure behind it becomes. Feedback loops must be engineered into the pipeline, allowing the system to improve with each interaction and data point.

A Real-World Illustration

Imagine an industrial AI agent tasked with predictive maintenance.
It might ingest real-time sensor data through Kafka, process signal anomalies using Flink (or an equivalent stream processor) and trigger predictions via a model hosted on Kubernetes or SageMaker. Inference results are logged to Amazon S3 or a data lakehouse for retraining. Each of these elements could be substituted based on organizational preferences, maturity and workload profiles.

The Road Ahead

The infrastructure behind AI agents isn't a static diagram; it's a living, evolving system. Designing effective data pipelines means embracing change, modularity and flexibility. Rather than betting on specific tools, organizations should focus on building architectures that are tool-agnostic, standards-aligned and use-case driven. That's how we future-proof AI systems and unlock their full potential.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives.
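The predictive-maintenance loop can be sketched end-to-end in plain Python. The in-memory list stands in for the Kafka topic, the z-score check for the stream-processing job, and the log for the S3/lakehouse sink; the readings and threshold are invented for illustration:

```python
from statistics import mean, stdev

# Simulated sensor topic: one clearly spiking reading among normal ones.
sensor_stream = [20.1, 19.8, 20.3, 20.0, 35.7, 20.2]

def is_anomaly(value: float, history: list, z_threshold: float = 3.0) -> bool:
    """Flag a reading that sits far outside the recent distribution."""
    if len(history) < 2:
        return False  # not enough history to estimate a distribution
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(value - mu) / sigma > z_threshold

inference_log = []          # stands in for the S3 / lakehouse sink
history = []                # stands in for the stream processor's state
for reading in sensor_stream:
    if is_anomaly(reading, history):
        inference_log.append({"reading": reading, "verdict": "maintenance-check"})
    history.append(reading)  # feedback: every event enriches the state

print(inference_log)
```

Swapping each stand-in for Kafka, Flink and a hosted model yields the production shape the article describes, without changing the loop's logic.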

Lemnisk Unveils Industry-First Innovations for the AI Era of Customer Engagement

Globe and Mail

24-06-2025

  • Business
  • Globe and Mail


New AI-driven features include Real-Time Predictive Scoring, Entity-Level Identity Resolution, Voice-to-CDP processing, and Model Context Protocol compliance.

Lemnisk, a leading enterprise Customer Data Platform (CDP) and marketing technology company, today introduced a suite of AI innovations that mark a significant leap forward in real-time, personalized customer engagement. Trusted for its enterprise-grade security and compliance, Lemnisk's enhanced platform now introduces advanced capabilities designed to power the next era of intelligent, AI-driven marketing automation.

Real-Time, Event-Driven Predictive Scoring: Go beyond static segmentation. Lemnisk now categorizes audiences in real time based on their immediate purchase likelihood or churn probability. This enables brands to proactively engage customers with timely retention and conversion strategies triggered by their actual behavior.

Entity-Level Identity Resolution: Move beyond 'one-size-fits-all' customer profiles. Now, you can unify customer intelligence across business lines, such as credit cards, loans and investments, while still executing campaigns at the individual product level.

Voice to CDP: Feed in contact center recordings to auto-transcribe them, extract sentiment and topic insights, and seamlessly feed them into the CDP as real-time segmentation signals. With automated clustering and no manual tagging required, this feature powers real-time personalization for voice journeys.

MCP Compliance for AI Agents: Lemnisk CDP is now MCP (Model Context Protocol) compliant via Lemnisk's external API and MCP Server Integration framework. This enables brand agents to securely access contextual enterprise data in real time and complete transactions inside conversations.

'The exploding AI landscape demands foundational changes in how enterprises understand and engage with customers. Traditional models of real-time responsiveness are being disrupted by agentic AI,' said Subra Krishnan, CEO of Lemnisk. 'Our latest innovations reflect a bold step forward, empowering marketers to anticipate needs, personalize at scale, and show up for customers in the exact moments that drive loyalty and growth. Just as importantly, we're future-proofing our platform to ensure enterprises stay ahead in an AI-first world.'

These AI-native capabilities are now generally available to all Lemnisk customers. To fully leverage the power of real-time intelligence and next-gen personalization, Lemnisk recommends that customers on earlier versions migrate to the latest release of the CDP.

About Lemnisk

Lemnisk is an AI-powered Customer Data Platform and real-time marketing automation solution that enables enterprises to drive higher conversions, retention, and customer lifetime value through personalized, data-driven engagement. Founded by Subra Krishnan, Rinku Ghosh, and Praveen D.S, Lemnisk is headquartered in Bangalore and has offices in Singapore, Dubai, and Boston.

Media Contact
Company Name: Lemnisk
Contact Person: Divya AN
Country: India
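Event-driven predictive scoring of the kind described is, at its core, a per-event state update: each behavioural event nudges a customer's score rather than waiting for a batch re-segmentation. A toy sketch of that pattern (the event names, weights and update rule are invented for illustration and bear no relation to Lemnisk's actual models):

```python
# Hypothetical per-event weights for a purchase-likelihood score.
WEIGHTS = {"viewed_pricing": 0.3, "abandoned_cart": -0.2, "support_call": -0.1}

def update_score(score: float, event: str) -> float:
    """Nudge the score on each incoming event, clamped to [0, 1]."""
    return min(1.0, max(0.0, score + WEIGHTS.get(event, 0.0)))

score = 0.5  # neutral prior before any behaviour is observed
for event in ["viewed_pricing", "abandoned_cart"]:
    score = update_score(score, event)  # score is current after every event
print(round(score, 2))
```

The point of the pattern is that a fresh score exists after every event, so a retention campaign can trigger the moment churn probability crosses a threshold, rather than on the next batch run.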
