
Latest news with #JohnMaynardKeynes

The future of money: Kashyap Kompella on what's next for this pivotal invention

Hindustan Times

5 days ago



What is money? Economists begin with function. When code replaces coin, we are no longer redesigning currency. We are redesigning control, compliance and consequence. (HT Illustration: Puneet Kumar via Midjourney)

Money, they say, does three things: it lets us exchange goods and services, acts as a standard measure of value, and lets us store value. A kind of economic Swiss Army knife, remarkably adaptable and endlessly circulated. John Maynard Keynes saw money as a bridge between present and future. Milton Friedman warned that it could be a weapon in the wrong hands; that too much of it, too fast, would corrode a system. Hyman Minsky went deeper still: all money is a promise, he said, but not all promises are equal. Some come wrapped in the authority of the state, others in the credibility of a bank, still others in the brute fact of power.

Today, a hundred-rupee note doesn't in fact represent a hundred rupees. It asks to be believed as such. What we call currency is a fiction wrapped in design: microtext and hologram, watermark and thread, security and ceremony. We dress our illusions well. Where coins offered a kind of weight and direct value, and the early notes were backed by metal (often gold), stored somewhere, safe and tangible, most currency is now backed by the heft of its respective state. By inertia as well, in a sense. But really, in a world where money is mostly numbers drifting across invisible networks, what holds it up is our collective agreement. Consensus as collateral. Money is the most powerful fiction humans ever agreed to believe. (Scroll to the bottom for more on how this works, and how we got here.)

The death of cash

Is it an accident that most money now doesn't even exist as paper? What does it mean that so much of the ritual and choreography around this asset is fading? There was a time when one went to the bank to update a passbook, and to an ATM to withdraw the notes. Money still had a place, a shape, a texture.
More and more, today, it doesn't. Cash, we are told, is the past. It is eulogised in policy memos and start-up decks. Replaced by cleaner, smarter tools; contactless, compliant, optimistically frictionless. It persists in temple donation boxes, in wedding envelopes, in the shadowy portions of real-estate payments. It lives in private safes, in the seams of sari blouses and under mattresses. It survives in the folds between trust and traceability. It neither asks its user's name nor logs their location. It holds no record of where it has been. But for how long?

Secret gardens

In the mythology of Silicon Valley, every system is just a feature waiting to be rebuilt: as faster, more frictionless, more easily monetisable. Money is no different. Here, the goal has become focused on erecting walled gardens to enclose wealth and spending. Capture the interface, own the flow. Build platforms that draw money in, and then design ways to keep it there. It's the Starbucks Rewards scheme, on a global scale. Already, expanding ecosystems of this kind have been built by Google, Apple, Amazon and others. For the user, the promise is a smooth, unified experience (and small benefits for staying within the walled garden). For the tech company, the potential is massive. If a salary is eventually disbursed onto a platform and spent within that platform, in tokens or credits or points, does it matter (to the owner of the platform) whose name is on each token as it changes hands?

The real revolution, however, will be driven by those who have been carefully watching. Early experiments in these walled gardens have shown governments — ie, the powers that create, regulate and oversee the money actually driving it all — what is possible and what will be embraced. Which brings us to…

Money with a mind

Imagine a coin that knows what it's for. A welfare payment that refuses to be spent on alcohol. A currency that reports to headquarters, quietly, after every transaction.
This isn't science fiction. These are the traits, or potential traits, of central bank digital currencies, which are being rolled out as test cases around the world, in countries ranging from China and Nigeria to Jamaica and India. This is programmable money; essentially, money with a mind. If it becomes widespread, then for the first time in history, standardised promissory notes will no longer be silent, disinterested participants in a transaction.

Governments could target subsidies more precisely and monitor corruption in real time. They could also automate compliance. Economic policy could be deployed like code: live, granular, conditional. The implications, of course, are enormous. This kind of currency could serve as a direct tool of control. The lines between incentive and instruction, governance and surveillance, public and private, could blur. Programmable money would turn spending into a performance that is constantly logged and evaluated. Unlike cash, this money could also be remotely controlled. In one possible scenario, a dissident isn't placed under house arrest; their credits are simply erased. And compliance becomes a hushed imperative.

Parallel tracks

In a strange twist, cryptocurrency — born of rebellion against the absurdities of hyper-capitalist definitions of money, and the excessive control wielded by governments through it — laid the groundwork for money with a mind. Bitcoin, the world's first such currency, was born in 2008, in the wake of the global financial crisis. If so much of the world's money was fiction to begin with, and could simply evaporate because it had no true inherent value, then why, its proponents argued, could a new kind of currency not improve on this with ideas of its own (such as limited supply and far greater transparency)? Bitcoin began to be 'mined' in 2009, generated as a fee or reward for using powerful computers to solve complex math problems. But hyper-capitalism claimed this revolution too.
As it gained in value, Bitcoin lost its claim to rebellion. Hype made it speculative and volatility made it impractical. What had been pitched as the people's money became one more asset class. As more cryptocurrencies emerged, creating, securing and transferring value without state intervention or control, the empire took note. Governments began planning centralised digital currencies. (In a final signal that this particular revolution has been co-opted, American government agencies are now considering using Bitcoin-backed instruments to shore up and diversify pension-fund portfolios.)

The future of us

For years, the future of money has been framed as a contest of forms. Would cash survive? Would the dollar be dethroned? Would we eventually pay via thumbprints, retinal scans, barcodes embedded in skin? These are interesting questions, but not the most important ones. Because the deeper shift isn't about form. It is about access. In a world where money is programmable, traceable and conditional, the critical questions will be: Who decides how it is used? And: Who will be watching, each time you swipe?

Now, the old order had problems. It leaked, it excluded, it corroded; it enabled hoarding, laundering and loopholes. The new order seeks to fix some of this. But what it fixes, it also redefines. We began by asking what money is. We end with something harder: How will it change us this time? Because money is never just money. It is infrastructure for belief. It is how a society encodes obligation. How it decides what counts, who counts, and on what terms value can be held, moved, withheld and erased. When code replaces the coin, when your account becomes your identity, we are no longer redesigning currency. We are redesigning control, compliance and consequence. Yes, cash may survive in the cracks. The dollar may hold its seat a while longer. Cryptocurrencies may mature. But these are surface questions.
The deeper shift is this: money is fusing with code, and code is never a silent participant. What we are building is not just a new financial system. It is a new moral architecture. One where every choice — by government, by company and by user — carries the weight of a rule once debated in public. The future of money is not a question of coins vs notes vs ledgers. It is the future of trust. The future of access. The future of power. Which is to say: the future of us.

(Kashyap Kompella is an industry analyst and author of two books on AI)

A TIMELINE: How our money came to be

Coins have a long history that overlaps with ideas of barter, soft power and annexation. So, the story of money for money's sake really dates to the earliest forms of non-metal currency: standardised promissory notes.

* 118 BCE: The Chinese empire takes its first steps towards lighter, more representative money by issuing tokens or promissory notes on leather.

* 1000 CE: In Sichuan, as trade booms, strings of coins are becoming too bulky to haul around, so black-and-red mulberry-paper receipts begin to be used instead. Sixteen merchants are awarded the right to issue these, and the government ultimately takes over, issuing the world's first fixed-denomination banknotes. They are essentially backed by bullion; a trader needs to hand over strings of coins and take an equivalent note in return. These notes can then circulate until someone returns to the merchant-banker to claim the corresponding coins. Of course, soon enough the notes are doing the rounds without the coins themselves being moved at all — and money is born.

(A banknote dated 1287, with its printing wood plate, from Yuan dynasty China. Wikimedia)

* 1200s: Central and western Asia take to the concept readily. Fast-forward 200 years and the Mongol emperor Kublai Khan has helped spread paper money all the way to Persia. But the concept baffles Europe.
Those reading about paper money in Marco Polo's travels think it so preposterous, they wonder if he's making it up.

* 1294: The Persian city of Tabriz experiments with paper money of its own but issues too much of it, sending the trading port of Basra into financial ruin.

* 1455: The Chinese goof up too. Their over-production of paper notes devalues their money. Paper money is eliminated at this point, and will not return for centuries. Currency reverts to metal.

(A treasure note from the Qing dynasty (1644-1912), China. Wikimedia)

* 1661: Dutch entrepreneur Johan Palmstruch, who founded Stockholms Banco in collaboration with the Swedish government, introduces kreditivsedlar, or credit notes. They come in set denominations, are watermarked, and bear a date of issue, a bank seal and eight banker signatures. They are a hit. But this bank, too, issues too many notes, and is liquidated.

(A 1666 banknote for 100 Swedish daler, issued by Stockholms Banco, signed by founder Johan Palmstruch. Wikimedia)

* 1694: England learns from Sweden and sets up the Bank of England to issue Pound Sterling notes to help fund a war with France.

* 1700s: Banks are appearing across the colonies. Currency notes are circulated within banking regions. For the public, it is a convenient and safe way to move money around. For the banks, it creates wealth from thin air: banks are permitted to print as much as a third more notes than they have coins in reserve.

* 1792: Following the end of the American War of Independence in 1783, the US dollar is declared the country's official currency.

* 1700s to 1900s: Given how much of the world Great Britain controls, it is no surprise that the Pound Sterling is the default global currency.

* 1944: World War 2 is devastating Britain. Its empire is shrinking at the same time. The US, meanwhile, is now the world's most stable economy.
Amid acknowledgement that the Pound Sterling will need to be replaced as the world's reserve currency, 44 allied countries come together to sign the Bretton Woods agreement. It fixes a rate of exchange for all foreign currencies against the US dollar, with the US promising, in theory, to back every dollar transaction using its vast reserves of gold. (Interestingly, it has shored up much of this gold as a result of trade surpluses during the two world wars.) The gold-backed dollar remains relatively stable, allowing other countries to back their currencies with dollars rather than gold. In order to back a currency with dollars, of course, one must have dollars. This creates an entirely separate revenue stream for the United States, turning US treasury bonds into one of the most powerful debt instruments in the world. Governments buy the bonds on the promise that the US can swap them for dollars at any time. They then use the bonds (plus actual dollar reserves) to keep their economies stable.

* 1971: US President Richard Nixon delinks the US dollar's representative value from the country's gold reserves. This is essentially a tacit admission that the economy has grown so large that there isn't enough gold in the world to back it with. Money as a social construct has entered a new phase. What does back the dollar now? Trust and goodwill, partly. As well as the understanding that the US economy can generate enough revenue (through the direct sale of goods and services, and through taxes and debt) to keep the still-growing system from imploding. But perhaps the most powerful thing keeping currencies today from crumbling is the quiet social contract by which we all agree not to look directly at the numbers, so as not to see them for the mirage that they, sort of, are. Instead, our system is backed by the idea that as long as the wheel keeps turning, the wheel will keep turning.

WHEELS WITHIN WHEELS: How much of our money is 'real'?
What does it really mean that most currency is no longer backed by gold? That isn't even really the question. The real question is: How much of our money is 'real'? And the answer is: It is impossible to say.

For instance, let's say that you put ₹1 lakh in the bank. The bank uses it to issue a loan to a customer. That money is now in two places at once. The customer who took the loan uses it to buy things; the person he pays uses it again. The money is now in multiple places at once. And that's not even accounting for how much of the original ₹1 lakh was 'yours' to begin with.

Anti-capitalists view this system as absurd, and it was partly as a mark of protest against this absurdity that Bitcoin was born. It was marketed as a fresh slate: anyone could get rich; there was no legacy wealth. Could it become the money of the future? It turned out, of course, to be simply another social construct, entangled with those that came before it: money, legislation, security, adoption, legitimacy. The idea of cryptocurrency has since become woven into the idea of centralised money.

There has been periodic talk of post-money economies replacing the tangles of today. It is wholly unclear what they would look like, or what kind of world we would need to build in order for them to work. For the moment, then, money remains the most stable means of exchange, if not the most just or logical. Just as elections remain the most stable means of governing large populations. Could a change be coming? It almost certainly is. The story of money, of societies, of people, after all, is just an endless unfolding of old to new.
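The deposit-and-relend loop described above is the textbook money-multiplier effect, and it can be sketched numerically. Here is a minimal illustration in Python; the 10% reserve ratio and the function name are assumptions chosen for the example, not a description of any real bank's rules:

```python
def broad_money(deposit: float, reserve_ratio: float, rounds: int) -> float:
    """Track how an initial deposit is re-lent and re-deposited.

    Each round, the bank keeps `reserve_ratio` of the new deposit as
    reserves and lends out the rest; the loan is spent, re-deposited,
    and becomes the next round's deposit. The sum of all deposits is
    the 'money' now existing in multiple places at once.
    """
    total_deposits = 0.0
    new_deposit = float(deposit)
    for _ in range(rounds):
        total_deposits += new_deposit
        new_deposit *= (1 - reserve_ratio)  # portion lent out, then re-deposited
    return total_deposits

# An initial ₹1,00,000 at a 10% reserve ratio approaches the
# geometric-series limit of deposit / reserve_ratio = ₹10,00,000.
print(round(broad_money(100_000, 0.10, 100)))
```

After 100 rounds the total is within a few tens of rupees of the ₹10 lakh limit, which is why a single ₹1 lakh deposit can, on paper, back ten times that much spending.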

America's broken politics affecting economy

Gulf Today

07-07-2025



The political realignment has come for economics. At least since the days of Friedrich Hayek and John Maynard Keynes in the last century, the divide in economic thinking roughly corresponded to the political split. In the mainstream, everyone was a capitalist and saw some role for government. The right/left divide was mostly over exactly how big that role should be. Now, in economics as in politics, it is no longer left versus right; it is moderates versus populists. The question isn't so much the optimal size of government in a global market-based economy; it is whether the economy is positive- or zero-sum, and how it entrenches power, according to Tribune News Service.

The result is unlikely allies and enemies. The horseshoe theory of politics holds that extreme left and right partisans agree more with each other than they do with the centrists in their party. That theory now also applies to economics. A decade and a half ago, economists and policy wonks were divided on things that in retrospect seem quite small — the structure of the Affordable Care Act, for example. More and more lately, I struggle to find disagreement with center-left economics pundits who used to make me shake my head. It could be that we are all moderating with age. But I don't think so. It's that the conversation has changed.

The debate is increasingly about questions we moderates have long seen as resolved, such as whether price controls work (no), globalization is a good thing (yes), or growth should be the primary objective (of course). These questions are being revisited because populists have become a much bigger and more influential force in US politics and policy — and as they do, centrists find that we have more in common with each other than with the more extreme wings of our respective camps. It's not just me.
Ezra Klein recently described a divide in the Democratic Party over the so-called abundance agenda, which argues that getting many regulations and special-interest groups out of the way can unlock more growth. So-called 'abundance liberals' argue that, with the right policies, the government can increase economic growth and make everyone better off. The more populist wing of the Democratic Party rejects this approach, because it sees the real problem as power. It has a more zero-sum view of the economy, in which the powerful (usually corporations and the rich) take most of the limited resources everyone should be entitled to.

I am closer to abundance liberals (let's make a bigger economic pie) than I am to populist liberals (let's make sure the pie slices are exactly even). I also support getting rid of wasteful regulations and favors to special-interest groups. The difference is that I think these barriers need to be removed to empower the private sector, not the government, to drive growth. This is not a trivial difference, and someday it will probably tear our fragile alliance apart. But for now, compared to the alternative, it feels semantic.

Conservatives are facing a divide similar to the one Klein describes among liberals. The populist strain of the right also sees the world as zero-sum and condemns the concentration of power — not of the rich, but among foreigners and institutions: universities, technology firms, government bureaucracies, international agencies, and so on. President Donald Trump's administration reflects this division. Its economic team includes representatives from the more traditional pro-growth wing of the Republican Party, with trained economists and people who worked in finance, as well as people from the more populist zero-sum wing, dominated by Yale Law graduates and their fellow travelers. This realignment will shape America's economic discourse and policies for the foreseeable future.
Rather than a right/left divide on the role of government, the main debate going forward will be between centrists and populists.

America's Broken Politics Is Breaking Economics, Too

Bloomberg

01-07-2025



The political realignment has come for economics. At least since the days of Friedrich Hayek and John Maynard Keynes in the last century, the divide in economic thinking roughly corresponded to the political split. In the mainstream, everyone was a capitalist and saw some role for government. The right/left divide was mostly over exactly how big that role should be. Now, in economics as in politics, it is no longer left versus right; it is moderates versus populists. The question isn't so much the optimal size of government in a global market-based economy; it is whether the economy is positive- or zero-sum, and how it entrenches power.

A jobless future? Rethinking work, worth, and what young graduates must do

New Indian Express

26-06-2025



In 1930, John Maynard Keynes prophesied that within a century, technology would advance so rapidly that we'd work just fifteen hours a week. The rest, he said, would be 'leisure filled with wisdom.' It is 2025 now. We have the technology; indeed, machines now write code, generate poetry, diagnose illnesses, and trade stocks. But Keynes' utopia has curdled into something else entirely: underemployment, precarity, and a deepening crisis of human worth in an age where artificial intelligence (AI) increasingly renders human labour redundant. The real crisis isn't just economic; it is existential.

The collapse of work as we knew it

We live in a world where there are more job seekers than jobs. This is not just a cyclical issue of economic downturns. It's structural. AI has magnified this structural imbalance. It can do more, for less, and often better. As Kai-Fu Lee, the former President of Google China and author of AI Superpowers, argues, 'AI will replace 40% of the world's jobs within 15 years.' What's more alarming is that it's not just factory workers or clerks; it's writers, lawyers, teachers, designers, and even software engineers. When one person with the help of AI can do the work of ten, what happens to the nine?

A strange silence fills the space where policy should speak. Governments tinker with skilling programmes, universities revise syllabi, but none address the fundamental dislocation underway. And the young, especially fresh graduates, find themselves caught in a world they were not prepared for.

Degrees without direction

For decades, education has been sold as a passport to prosperity. Get a degree, any degree, and you'll find a job. But this promise has frayed. In India, nearly 42% of graduates under 25 are unemployed (CMIE, 2024). The situation is not much better in many parts of the developed world. Even in the United States, the so-called land of innovation, graduates struggle with underemployment, debt, and gig-economy drudgery.
It's not just a supply-demand mismatch. It's a value mismatch. Most formal education continues to reward repetition, compliance, and memory: all the things machines now do better. Creativity, synthesis, moral judgment, and emotional intelligence, the truly human capacities, remain undernourished. So, what should a young graduate do?

Learning how to learn and unlearn

The first and most urgent shift required is psychological. No job is for life anymore. The linear career path (education, job, promotion, retirement) is dead. In its place is the zigzag of projects, reinventions, collaborations, failures, and perhaps something deeper: 'vocation'. Graduates must understand that 'learning to learn' is the only skill with lasting value. As Yuval Noah Harari, author of 21 Lessons for the 21st Century, points out, 'In a world flooded with irrelevant information, clarity is power.' This clarity comes not from memorizing facts, but from knowing how to think: critically, ethically, and contextually. The young must cultivate the courage to unlearn: to discard stale notions of prestige and 'safe' careers, and instead explore what problems are worth solving, not just what skills are worth selling.

Build for the human future, not the machine's

Technology isn't destiny. It reflects values. AI is not some cosmic force; it is built by people, trained on data, and shaped by incentives. The real question is: What kind of society are we building with AI? Tristan Harris, the former Google ethicist who now runs the Centre for Humane Technology, warns against 'technology that hacks human weaknesses.' He calls for a renaissance of humane design: technologies that augment human agency rather than automate human obsolescence. Graduates from every stream, whether arts, sciences, or commerce, must ask: What is the human role in a machine world? The answers won't come from textbooks but from interdisciplinary exploration.
The philosopher Martha Nussbaum, for instance, argues that a liberal arts education is more essential than ever, not to churn out 'job-ready' employees but citizens capable of compassion, curiosity, and democratic judgment.

Think local, act planetary

The AI boom has exposed another stark truth: the world is connected, but the gains are not. Most AI tools are built for urban, Western contexts. But the crises of hunger, health, education, and climate disproportionately affect the Global South. Young graduates, especially in India, must think of their work as service, not survival alone. The future is not in chasing scarce jobs, but in creating new forms of value rooted in local needs.

This is where Gandhi becomes relevant again, not as a figure of nostalgia but as a thinker of radical modernity. His idea of Nai Talim (basic education) was to blend learning with livelihood, knowledge with character. Imagine a generation of graduates who build AI tools for farmers, who design educational games in regional languages, who run micro-health networks in remote villages. These are not 'jobs' in the conventional sense but works of purpose.

Community, not just individual genius

We are sold the myth of the solitary tech wizard who changes the world with a startup. But meaningful innovation rarely happens in isolation. It emerges in communities of trust, where different minds bring different strengths. Young graduates must therefore invest in relationships, collaborations, and networks of mutual aid. In an age of hyper-individualism, the future will belong to those who can build teams, share credit, and solve problems collectively.

The corporate economy extracts; the living economy regenerates. The young must turn their gaze from extractive metrics (salaries, designations, packages) to regenerative ones: are we restoring, healing, building? These questions matter more than ever in a time when work must serve not just profit, but purpose.
The future will not be built by those who merely seek to compete with machines, but by those who can reimagine what it means to be human in a world remade by them.

The work of being human

Let us be clear. AI will continue to advance. It will outstrip us in efficiency, consistency, and speed. But it will never replace meaning, beauty, empathy, or love: the things that make life worth living. A young graduate, therefore, must not ask 'What job can I get?' but 'What human role can I inhabit that AI never can?' This is not the end of work. It is the beginning of a new imagination of work, not as a market commodity, but as an act of creation, care, and contribution. That, Keynes might agree, is the real future worth building.

(The author is a Bangalore-based management professional, literary critic, translator, and curator)

From vaults to verdicts: Central banks say it's still gold's time to shine

Economic Times

19-06-2025



For years, the so-called financial wizards laughed at the Indian housewife's fetish for gold. Economist John Maynard Keynes was no exception, calling the yellow metal a 'barbarous relic'. But the prudent Indian housewife showed the mirror to the financial veterans. Not only have institutional investors developed a new-found love for gold in recent years; the lords of the financial systems, those who print the currencies that fill every wallet – the
