The US has just exposed the green industry's dirty little secret

Telegraph · 3 days ago
The cat is out of the bag. Electricity made from renewable sources is not as 'cheap' as its advocates sometimes claim. It evidently cannot survive without billions annually in tax credits.
That's the message from the latest skirmish over America's renewable energy future, where the House and Senate have unveiled duelling visions for the rollback of energy tax credits – each with its own tempo and tone. The vitriolic reaction from the green lobby, and the predictions of disaster for renewables should any of these changes be passed into law, have exposed just how economically unsustainable even the fiercest backers of these energy sources clearly accept them to be.
Supporters of renewable energy have assured us for years that the wind blows and the sun shines free of charge. But although these technologies have received hundreds of billions in subsidies globally over the past 20 years, proponents still demand more – for a few years, we're told, until renewables can stand on their own feet.
Senate Minority Leader Chuck Schumer said: 'Eliminating these tax credits radically and irresponsibly rolls back all the progress we have made in recent years. It turns America's clean energy boom into a bust.'
But the boom was always something of an illusion. It is often asserted that electricity in the United States made with wind and solar is less expensive than electricity made by natural gas and coal. But rather than declining, average American electricity prices have risen considerably over the past 20 years as wind and solar have entered the electricity mix.
One dirty little secret is that, on a state-by-state basis, nine out of the top 10 states in electricity prices in the United States in 2024 required renewable energy as part of their electricity mix. The bottom 10 states generally did not require renewable energy.
It can cost utility companies more to provide electricity from intermittent sources than from continuous ones such as natural gas, coal and nuclear power, because they must keep other generation in place as back-up for when the wind doesn't blow and the sun doesn't shine.
For instance, when the wind stops, an alternative such as a natural gas power plant will likely need to be switched on to meet demand, then switched off again when the wind picks up. With America's low natural gas prices, it is likely to be cheaper to maintain one set of equipment and run one power plant continuously than to have it sit idle while the wind blows.
Taxpayers are paying multiple times for renewables. In their electricity bills, they pay not only for wind and solar, but for the backups to the wind and solar. In their tax bills, they pay for the energy tax credits. They also give up faster economic growth when electricity prices rise.
Another dirty secret is that renewable energy is often neither green nor clean. About 70 per cent of solar panels, wind turbines, batteries and their components are made in China, which remains reliant on coal-fired power plants to fuel its industries. Wind turbines kill birds, and, when offshore, can harm sea mammals. Solar power can take over agricultural land, which is likely to drive up the price of food. Green and clean are marketing hype used to push renewables onto unsuspecting consumers.
While both chambers agree on tightening the purse strings by reducing tax credits, the House opts for a cliff-edge approach, while the Senate favours a more gradual wind-down.
The House draws a hard line at Dec 31, 2025. From clean vehicles to home energy upgrades, nearly all credits vanish at the stroke of midnight. Even the clean hydrogen and nuclear incentives face sharp cut-offs, with added restrictions on foreign influence. Transferability of credits? Many are axed. The message is clear: the era of generous subsidies is fast ending.
The Senate, by contrast, offers a more calibrated exit. Clean vehicle credits expire by Sep 30, 2025, but major production and investment credits are phased out over years, some as late as 2036. The Senate also tightens rules on foreign entities, but with more nuanced thresholds and timelines.
Both bills close ranks on national security. Credits are denied to entities with ties to China, Russia, and other adversaries. The clean hydrogen credit in the House bill expires at the end of this year, but in the Senate bill by the end of 2027. Carbon capture faces identical construction cut-offs and foreign ownership bans. But only the House repeals credit transferability, an investor-friendly feature the Senate preserves.
With the end of these tax credits, Americans may well discover that the true costs of renewable energy are higher than utility companies are willing to bear. Developers are already saying that they will halt projects without the tax credits.
If the age of renewable energy tax credits is drawing to a close, Americans will be the beneficiaries. The question is how abruptly Washington will pull the plug – and whether other countries will follow.

Related Articles

Inside Barron Trump's outrageously lavish childhood as president's son bags his first girlfriend

Daily Mail · 20 minutes ago

Barron Trump was spoiled with expensive tuition and lived in a home fit for a king during his lavish childhood. The 19-year-old student, who is the only child of President Donald Trump and his wife Melania Trump, was born on March 20, 2006, and has remained largely out of the spotlight. But as reports emerge that the teen has bagged himself his first ever girlfriend, Femail delves into Barron's luxurious upbringing, taking a particularly close look at the hundreds of thousands spent on his education.

Barron was born and raised in New York City and spent much of his childhood living in his dad's extravagant penthouse apartment in Trump Tower, Manhattan. The sprawling home, worth an estimated $65 million and located on the top three floors of the tower, boasts breathtaking views of Central Park and the Manhattan skyline. According to reports, Barron has a floor of his own in the lavish residence.

Barron attended school in New York City until 2017, when his dad took office for his first term in the White House. He then attended middle and high schools in different locations across the country. For his primary education, Barron went to Manhattan's prestigious Columbia Grammar and Preparatory School, where tuition is billed at $64,340 per year and rises by about $1,000 once students enter high school. Trump started his first term as president halfway through the school year in January 2017, so Barron remained with Melania in New York to finish up his time at Columbia.

In September 2017, he started sixth grade at St. Andrew's Episcopal in Potomac, Maryland, about a 45-minute drive from Washington, D.C. The yearly middle school tuition at St. Andrew's cost $52,290. Once Trump's first term was up in 2020, Barron moved to Palm Beach, Florida, and finished school at Oxbridge Academy in West Palm Beach, graduating last May.
Barron is currently a student at NYU and has just completed his freshman year. He has returned to live at Trump Tower while he completes his college education.

(Pictured: Melania and Barron attend the annual playground party with New York families in support of Central Park's playgrounds, May 30, 2007.)

President Trump revealed last month his 'formula for good parenting'. 'I always said the same thing. I said: no drugs, no alcohol, no cigarettes. I also would say don't get tattoos, but I don't say it too strongly, because a lot of people have gotten tattoos, and that's what they choose to do,' he told The New York Post in a podcast interview. Many of the president's supporters at rallies sport tattoos, as do some members of Trump's Cabinet, including Defense Secretary Pete Hegseth. The president has made a point of not drinking, as his older brother, Fred Trump Jr., suffered from alcoholism and died young, at age 42. He boasted that his five children – Donald Trump Jr., Ivanka, Eric, Tiffany and Barron – were 'born smart.' 'Barron is great. He is very tall and good,' he added.

Ingram Micro says it identified ransomware on certain of its internal systems

Reuters · 31 minutes ago

July 5 (Reuters) - Ingram Micro (INGM.N) said on Saturday it recently identified ransomware on certain of its internal systems. The information technology company took steps to secure the relevant environment, including taking certain systems offline, it said in a statement. The Irvine, California-based company also launched an investigation with the assistance of leading cybersecurity experts and notified law enforcement, it added.

Context Engineering for Financial Services: By Steve Wilcockson

Finextra · an hour ago

The hottest discussion in AI right now – at least the one not about agentic AI – is about how "context engineering" matters more than prompt engineering: how you give AI the data and information it needs to make decisions, and why it cannot (and must not) be a solely technical function.

"'Context' is actually how your company operates; the ideal versions of your reports, documents & processes that the AI can use as a model; the tone & voice of your organization. It is a cross-functional problem." So says renowned tech influencer and Wharton School associate professor Ethan Mollick. He in turn cites fellow tech influencer Andrej Karpathy on X, who in turn cites Tobi Lutke, CEO of Shopify: "It describes the core skill better: the art of providing all the context for the task to be plausibly solvable by the LLM."

The three together – Mollick, Karpathy and Lutke – make for a powerful triumvirate of tech influencers. Karpathy consolidates the subject nicely. He emphasizes that in real-world, industrial-strength LLM applications, the challenge is filling the model's context window with just the right mix of information. He thinks about context engineering as both a science – because it involves structured systems, system-level thinking, data pipelines and optimization – and an art, because it requires intuition about how LLMs interpret and prioritize information. His analysis reflects two of my predictions for 2025: one highlighting the increasing impact of uncertainty, the other a growing appreciation of knowledge.

Tech mortals offered further useful comments on the threads, two of my favourites being:

"Owning knowledge no longer sets anyone apart; what matters is pattern literacy – the ability to frame a goal, spot exactly what you don't know, and pull in just the right strands of information while an AI loom weaves those strands into coherent solutions."

"It also feels like 'leadership', Tobi. How to give enough information, goal and then empower."

I love the AI loom analogy, in part because it corresponds with one of my favourite data descriptors, the "contextual fabric". I like the leadership positivity too, because the AI looms and contextual fabrics are led by and empowered by humanity.

Here's my spin, to take or leave. Knowledge, based on data, isn't singular; it's contingent and contextual. Knowledge – and thus the contextual fabric of data in which it is embedded – is ever changing, constantly shifting, dependent on situations and needs. I believe knowledge is shaped by who speaks, who listens, and what is spoken about. That is, to a large extent, led by power and the powerful. Whether in Latin, science, religious education, finance and now AI, what counts as "truth" is often a function of who gets to tell the story. It's not just about what you know, but how, why and where you know it, and who told you it. But of course it's not that simple; agency matters – the peasant can become an abbot, the council house schoolgirl can become a Nobel prize-winning scientist, a frontier barbarian can become a Roman emperor. For AI, on one hand "truth" is held by the big tech firms and grounded in their biases; on the other, AI is democratizing, in that all of us and our experiences help train and ground it – in theory at least. I digress.

For AI-informed decision intelligence, context will likely be the new computation: it makes GenAI tooling more useful than an oft-hallucinating stochastic parrot, while helping traditional AI – predictive machine learning, for example – become increasingly relevant and affordable for the enterprise.

Context Engineering for FinTech

Context engineering – the art of shaping the data, metadata, and relationships that feed AI – may become the most critical discipline in tech. This is gold for those of us in the FinTech data engineering space, because we're the dudes helping you create your own context.
I'll explore how five different contextual approaches, all representing data engineering-relevant vendors I have worked for – technical computing, vector-based, time-series, graph and geospatial platforms – can support context engineering.

Parameterizing with Technical Computing

Technical computing tools – think R, Julia, MATLAB and Python's SciPy stack – can integrate domain-specific data directly into the model's environment through structured inputs, simulations, and real-time sensor data, normally as vectors, tables or matrices. For example, in engineering or robotics applications, an AI model can be fed contextual information such as system dynamics, environmental parameters, or control constraints. Thus the model can make decisions that are not just statistically sound but also physically meaningful within the modeled system.

These tools can dynamically update the context window of an AI model, for example in scenarios like predictive maintenance or adaptive control, where AI must continuously adapt to new data. By embedding contextual cues, like historical trends, operational thresholds, or user-defined rules, such tools help ground the model's outputs in the specific realities of the task or domain.

Financial Services Use Cases

Quantitative Strategy Simulation: Simulate trading strategies and feed the results into an LLM for interpretation or optimization.

Stress Testing Financial Models: Run Monte Carlo simulations or scenario analyses and use the outputs to inform LLMs about potential systemic risks.

Vectors and the Semantics of Similarity

Vector embeddings are closely related to the linear algebra of technical computing, but they bring semantic context to the table. Typically stored in so-called vector databases, they encode meaning into high-dimensional space, allowing AI to retrieve through search not just exact matches but conceptual neighbors. They thus allow for multiple stochastically arranged answers, not just one.
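The stress-testing use case above lends itself to a concrete sketch. The following is my own illustration, not code from the article: it runs a small Monte Carlo simulation with NumPy and condenses the result into a text block that could be placed in an LLM's context window. All parameter values and the prompt wording are invented for the example.

```python
# Sketch: run a Monte Carlo portfolio simulation, then serialise the summary
# statistics into a structured context fragment for an LLM. (Illustrative
# parameters only; not the article's code.)
import numpy as np

def monte_carlo_summary(mu: float, sigma: float, horizon_days: int,
                        n_paths: int = 10_000, seed: int = 0) -> dict:
    """Simulate terminal portfolio returns and summarise the distribution."""
    rng = np.random.default_rng(seed)
    # Daily log-returns, summed over the horizon for each simulated path.
    daily = rng.normal(mu / 252, sigma / np.sqrt(252), size=(n_paths, horizon_days))
    terminal = np.exp(daily.sum(axis=1)) - 1.0
    return {
        "mean_return": float(terminal.mean()),
        "var_95": float(np.percentile(terminal, 5)),  # 5th-percentile return
        "worst_case": float(terminal.min()),
    }

def build_llm_context(stats: dict) -> str:
    """Serialise simulation output into a prompt fragment for an LLM."""
    return (
        "Context: Monte Carlo stress test (10,000 paths)\n"
        f"- Mean horizon return: {stats['mean_return']:.2%}\n"
        f"- 5th-percentile return (95% VaR level): {stats['var_95']:.2%}\n"
        f"- Worst simulated path: {stats['worst_case']:.2%}\n"
        "Task: explain the portfolio's downside risk to a risk committee."
    )

stats = monte_carlo_summary(mu=0.07, sigma=0.20, horizon_days=60)
context = build_llm_context(stats)
```

The point of the pattern is that the LLM never sees raw paths, only a compact, domain-meaningful summary that fits in its context window.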
Until recently, vector embeddings and vector databases have been the primary providers of enterprise context to LLMs, shoehorning all types of data into searchable mathematical vectors. Their downside is their brute-force, compute-intensive approach to storing and searching data. That said, they use similar transfer learning approaches – and deep neural nets – to those that drive LLMs. As expensive, powerful brute-force vehicles of Retrieval-Augmented Generation (RAG), vector databases don't simply store documents but understand them, and they have an increasingly proven place in enabling LLMs to ground their outputs in relevant, contextualized knowledge.

Financial Services Use Cases

Customer Support Automation: Retrieve similar past queries, regulatory documents, or product FAQs to inform LLM responses in real time.

Fraud Pattern Matching: Embed transaction descriptions and retrieve similar fraud cases to help the model assess risk or flag suspicious behavior.

Time-Series, Temporal and Streaming Context

Time-series database and analytics providers, and the in-memory and columnar databases that can organize their data structures by time, specialize in knowing about the when. They can ensure that temporal context – the heartbeat of many use cases in financial markets as well as IoT and edge computing – grounds AI at the right time with time-denominated sequential accuracy. Streaming systems like Kafka, Flink et al can also act as the real-time central nervous systems of financial event-based architectures. It's not just about having access to time-stamped data, but about analyzing it in motion, enabling AI to detect patterns, anomalies, and causality as close as possible to real time. In context engineering, this is gold. Whether it's fraud that happens in milliseconds or sensor data populating insurance telematics, temporal granularity can be the difference between insight and noise, with context stored and delivered by what some might see as a data timehouse.
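The retrieval pattern behind the customer-support and fraud-matching use cases above can be sketched in a few lines. This is a toy of my own devising, not the article's: a deterministic hashing trick stands in for a learned embedding model, and a brute-force cosine scan stands in for a vector database.

```python
# Sketch: embed texts as vectors, rank stored documents by cosine similarity
# to a query, and splice the best match into an LLM prompt. The "embedder"
# here is a stand-in: real systems use a trained embedding model.
import zlib
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    """Stand-in embedder: average of per-token pseudo-random vectors.
    Texts sharing tokens get correlated vectors."""
    vecs = []
    for token in text.lower().split():
        # zlib.crc32 gives a deterministic per-token seed.
        rng = np.random.default_rng(zlib.crc32(token.encode()))
        vecs.append(rng.normal(size=dim))
    v = np.mean(vecs, axis=0)
    return v / np.linalg.norm(v)

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Brute-force cosine-similarity search: the job a vector DB scales up."""
    q = embed(query)
    sims = [float(q @ embed(d)) for d in docs]
    order = np.argsort(sims)[::-1][:k]
    return [docs[int(i)] for i in order]

faqs = [
    "How do I reset my online banking password",
    "What are the fees for international wire transfers",
    "How do I report a lost or stolen card",
]
# The retrieved passage becomes grounding context in the LLM prompt.
top = retrieve("customer reports a lost card", faqs, k=1)
prompt = "Relevant policy:\n" + top[0] + "\nAnswer the customer's question."
```

Conceptual neighbours, not exact string matches, are what make this useful: the query never repeats the FAQ's wording verbatim.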
Financial Services Use Cases

Market Anomaly Detection: Injecting real-time price, volume, and volatility data into an LLM's context allows it to detect and explain unusual market behavior.

High-Frequency Trading Insights: Feed LLMs with microsecond-level trade data to analyze execution quality or latency arbitrage.

Graphs That Know Who's Who

Graph and relationship-focussed providers play a powerful role in context engineering by structuring and surfacing relationships between entities that are otherwise hidden in raw data. In the context of large language models (LLMs), graph platforms can dynamically populate the model's context window with relevant, interconnected knowledge, such as relationships between people, organizations, events, or transactions. They enable the model to reason more effectively, disambiguate entities, and generate responses that are grounded in a rich, structured understanding of the domain.

Graphs can act as a contextual memory layer through GraphRAG and Contextual RAG, ensuring that the LLM operates with awareness of the most relevant and trustworthy information. For example, graph databases – or other environments, such as Spark, that can store graph data types in accessible file formats like Parquet on HDFS – can be used to retrieve a subgraph of relevant nodes and edges based on a user query, which is then serialized into natural language or structured prompts for the LLM. Platforms that focus graph context on entity resolution and contextual decision intelligence can enrich the model's context with high-confidence, real-world connections, which is especially useful in domains like fraud detection, anti-money laundering, and customer intelligence.

Think of it as Shakespeare's Comedy of Errors meets Netflix's Department Q. Two Antipholuses and two Dromios rather than one of each in The Comedy of Errors? Only one Jennings brother to investigate in Department Q's case, and where does Kelly MacDonald fit into anything?
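The market anomaly detection use case above can be illustrated with a rolling z-score, my own toy stand-in for the streaming analytics the article describes. The price series, the engineered 10% shock, and the text rendering are all synthetic.

```python
# Sketch: flag anomalies in a price series with a rolling z-score on
# log-returns, then render the flagged points as a text block an LLM
# could be asked to explain. (Synthetic data; illustrative only.)
import numpy as np

def rolling_zscore_anomalies(prices, window=20, threshold=3.0):
    """Return price-series indices where the return deviates more than
    `threshold` standard deviations from the trailing window's mean."""
    returns = np.diff(np.log(prices))
    flags = []
    for t in range(window, len(returns)):
        hist = returns[t - window:t]
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and abs(returns[t] - mu) / sigma > threshold:
            flags.append(t + 1)  # shift back to an index into `prices`
    return flags

# Synthetic series: a calm random walk with one engineered shock.
rng = np.random.default_rng(42)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.005, 300)))
prices[150:] *= 0.90  # sudden 10% drop at t=150

anomalies = rolling_zscore_anomalies(prices)
llm_context = "Flagged timestamps: " + ", ".join(map(str, anomalies))
```

The analysis happens "in motion" in a real system; the output handed to the LLM is just the small set of flagged timestamps, not the full stream.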
Entity resolution and graph context can help resolve and connect such entities in a way that more standard data repositories and analytics tools struggle with. LLMs cannot function without correct and contingent knowledge of people, places, things and the relationships between them, though to be sure, many types of AI can also help discover those connections and resolve the entities in the first place.

Financial Services Use Cases

AML and KYC Investigations: Surface hidden connections between accounts, transactions, and entities to inform LLMs during risk assessments.

Credit Risk Analysis: Use relationship graphs to understand borrower affiliations, guarantors, and exposure networks.

Seeing the World in Geospatial Layers

Geospatial platforms support context engineering by embedding spatial awareness into AI systems, enabling them to reason about location, proximity, movement, and environmental context. They provide rich, structured data layers (e.g., terrain, infrastructure, demographics, weather) that can be dynamically retrieved and injected into an LLM's context window. This allows the model to generate responses that are not only linguistically coherent but also geographically grounded.

For example, in disaster response, a geospatial platform can provide real-time satellite imagery, flood zones, and population density maps. This data can be translated into structured prompts or visual inputs for an AI model tasked with coordinating relief efforts or summarizing risk. Similarly, in urban planning or logistics, geospatial context helps the model understand constraints like traffic patterns, zoning laws, or accessibility. In essence, geospatial platforms act as a spatial memory layer, enriching the model's understanding of the physical world and enabling more accurate, context-aware decision-making.

Financial Services Use Cases

Branch Network Optimization: Combine demographic, economic, and competitor data to help LLMs recommend new branch locations.
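A minimal sketch of the AML-style graph context described above, entirely my own invention (the entities, relations and helper functions are hypothetical): it walks an entity's two-hop neighbourhood and serialises the edges into plain-language lines an LLM prompt could carry, the GraphRAG idea in miniature.

```python
# Sketch: hold an entity graph as (subject, relation, object) triples, pull
# the neighbourhood of a suspect account, and serialise it for an LLM prompt.
# All names are fabricated for illustration.
edges = [
    ("Acct_1001", "owned_by", "Alice Ng"),
    ("Acct_2002", "owned_by", "Alice Ng"),
    ("Acct_1001", "wired_to", "Acct_9009"),
    ("Acct_9009", "registered_in", "Shell Co Ltd"),
]

def neighbourhood(entity: str, hops: int = 2) -> list[tuple[str, str, str]]:
    """Collect edges reachable from `entity` within `hops` steps."""
    frontier, seen, found = {entity}, {entity}, []
    for _ in range(hops):
        nxt = set()
        for s, rel, o in edges:
            if s in frontier or o in frontier:
                if (s, rel, o) not in found:
                    found.append((s, rel, o))
                nxt.update({s, o} - seen)
        seen |= nxt
        frontier = nxt
    return found

def to_prompt(entity: str) -> str:
    """Serialise the subgraph into natural-language lines for the LLM."""
    lines = [f"{s} {rel.replace('_', ' ')} {o}." for s, rel, o in neighbourhood(entity)]
    return f"Known facts about {entity} and its network:\n" + "\n".join(lines)

facts = neighbourhood("Acct_1001")
investigation_prompt = to_prompt("Acct_1001")
```

Note how the two-hop walk surfaces the second account held by the same owner and the shell company behind the wire: exactly the hidden connections the AML use case is about.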
Climate Risk Assessment: Integrate flood zones, wildfire risk, or urban heat maps to evaluate the environmental exposure of mortgage and insurance portfolios.

Context Engineering Beyond the Limits of Data, Knowledge & Truths

Context engineering, I believe, recognizes that data is partial, and that knowledge – and perhaps truth, or truths – needs to be situated, connected, and interpreted. Whether through graphs, time-series, vectors, technical computing platforms, or geospatial layering, AI depends on weaving the right contextual strands together. Where AI represents the loom, the five types of platforms I describe are the spindles, needles, and dyes, drawing on their respective contextual fabrics of ever-changing data and driving threads of knowledge – contingent, contextual, and ready for action.
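To close, the climate-risk screening use case above can be made concrete. This final sketch is mine, not the article's: hypothetical coordinates and a hypothetical flood-zone centre, with the haversine formula flagging exposed loans and the result rendered as a one-line context string for an LLM.

```python
# Sketch: flag mortgaged properties inside a flood-zone radius using the
# haversine great-circle distance, then summarise the exposure as text an
# LLM could reason over. (All coordinates and loan IDs are invented.)
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical portfolio and flood-zone centre (illustrative coordinates).
properties = {
    "Loan_A": (51.48, -0.10),  # close to the zone
    "Loan_B": (53.48, -2.24),  # far away
}
flood_zone = (51.50, -0.12)    # centre of a modelled flood area
radius_km = 5.0

exposed = [
    loan for loan, (lat, lon) in properties.items()
    if haversine_km(lat, lon, *flood_zone) <= radius_km
]
exposure_context = f"Loans within {radius_km} km of the flood zone: {', '.join(exposed)}"
```

As with the other sketches, the model receives only the distilled spatial fact (which loans sit in the zone), not the raw map layers.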
