What is clubbing provision in Income Tax Act? When is it applicable?


Time of India · 7 days ago
What is clubbing of income?

As the term suggests, clubbing means combining the income of another person with one's own for tax calculation purposes. Many people divert income or assets to family members and close relatives in order to evade tax or minimise their tax liability. To prevent such evasion and encourage compliance, the government has laid down rules for clubbing of income under Sections 60-64 of the Income Tax Act, 1961.

Clubbing provisions

These rules, or clubbing provisions, specify the types of income, the relationships and the circumstances under which income is clubbed for tax purposes. Not all relationships or money transfers attract clubbing: spouses, minor children, daughters-in-law, Hindu Undivided Families and certain specified close relatives typically invite these provisions. For instance, if a parent invests in a fixed deposit (FD) in a minor child's name, clubbing provisions apply and the interest generated from the FD is clubbed with the income of the parent who earns more. The parent can, however, claim an exemption of up to Rs 1,500 per child under the old tax regime (a rough worked sketch of this example appears at the end of the article). Clubbing provisions also won't apply if a minor child earns an income through manual work or by using his own skill and talent, or if he suffers from a disability. The income considered for clubbing can come from various sources and investment avenues such as property, interest, mutual funds and bank deposits. Since taxable income increases after clubbing, it also affects the ITR form chosen to file the tax return.

When is clubbing not applicable?

Clubbing provisions among relatives and family members don't apply under these specific conditions:
If money or assets are transferred to the wife or daughter-in-law before marriage.
If gifts are received at the time of marriage; the income from these is also not clubbed in the hands of the transferor.
If one stays with parents and pays them rent or gives them a monetary gift; even if they invest this amount, it will not invite clubbing provisions.
If money is invested in the PPF for a spouse or child.
If money is saved by the wife from the household expense funds given by the husband.
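The fixed-deposit example above can be made concrete with a short, hypothetical sketch in Python. The deposit size, interest rate and the parent's slab rate below are assumed figures for illustration only; the Rs 1,500-per-child exemption is the figure mentioned in the article (it comes from Section 10(32) and applies under the old regime).

# Hypothetical illustration of clubbing a minor child's FD interest (old regime).
# Deposit size, interest rate and the parent's slab rate are assumed figures.

fd_principal = 500_000        # FD opened in the minor child's name (assumed)
fd_rate = 0.07                # assumed annual FD interest rate
child_interest = fd_principal * fd_rate          # income arising in the child's name

exemption_per_child = 1_500   # exemption per minor child under the old regime
clubbed_income = max(child_interest - exemption_per_child, 0)

parent_marginal_rate = 0.30   # assumed slab rate of the higher-earning parent
extra_tax = clubbed_income * parent_marginal_rate

print(f"Interest earned in child's name : Rs {child_interest:,.0f}")
print(f"Income clubbed with the parent  : Rs {clubbed_income:,.0f}")
print(f"Additional tax at 30% slab      : Rs {extra_tax:,.0f}")

In practice, cess and surcharge would also apply; the point is simply that the interest is taxed at the higher-earning parent's rate, not the child's.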

Related Articles

Financial Planning: 1 Crore May Not Be Enough For A Comfortable Life Post Retirement

NDTV · 7 hours ago

People often ask how much savings is enough for a comfortable life after retirement in India. The general assumption has been that Rs 1 crore is enough to make ends meet after retiring. Because of inflation, however, that is no longer a safe target: Rs 1 crore today will lose much of its value over time. For instance, at a 6 per cent inflation rate, Rs 1 crore would be worth only Rs 55.84 lakh in 10 years and Rs 31.18 lakh in 20 years. To combat inflation, it is essential to invest in assets that yield returns higher than the inflation rate, such as stocks, real estate or bonds. Early planning and diversified investments are crucial to ensure long-term financial security, NDTV Profit reported. Rs 1 crore may once have been a sufficient goal, but today it is better treated as a milestone than as the finish line.

According to the India Retirement Index Study (IRIS 4.0) by Max Life Insurance, only 44% of respondents believe retirement planning should start before the age of 35, highlighting the need for greater awareness and planning.

To build a corpus of Rs 1 crore or more, many people rely on a monthly SIP. For example, investing Rs 1,15,000 a month for five years could accumulate about Rs 1 crore, assuming a sufficiently high rate of return. PPF can also help: investing Rs 1.5 lakh annually in a Public Provident Fund (PPF) for 25 years could yield over Rs 1 crore at the current interest rate of 7.1%. Another benefit of PPF is that it is a long-term debt instrument in the Exempt-Exempt-Exempt (EEE) category, so contributions, the interest earned and withdrawals are all tax-free, and the return is guaranteed. A rough check of these figures appears in the sketch below.
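The following is a minimal, illustrative sketch of the compound-interest arithmetic behind the figures quoted above. The 12% annual SIP return is an assumption, since the article does not state the rate it uses, and PPF compounding is simplified to a fixed start-of-year annual deposit compounded yearly.

# Rough check of the figures quoted above (illustrative, not financial advice).

def real_value(amount, inflation, years):
    # Purchasing power of `amount` after `years` of constant inflation.
    return amount / (1 + inflation) ** years

def sip_future_value(monthly, annual_rate, months):
    # Future value of a monthly SIP with contributions at the start of each month.
    r = annual_rate / 12
    return monthly * (((1 + r) ** months - 1) / r) * (1 + r)

def ppf_future_value(yearly, rate, years):
    # Simplified PPF: a fixed deposit at the start of each year, compounded annually.
    return yearly * (((1 + rate) ** years - 1) / rate) * (1 + rate)

crore = 1e7
print(real_value(crore, 0.06, 10))          # ~55.8 lakh: 1 crore after 10 years of 6% inflation
print(real_value(crore, 0.06, 20))          # ~31.2 lakh after 20 years
print(sip_future_value(115_000, 0.12, 60))  # ~95 lakh at an assumed 12% p.a.; roughly 14% is needed to cross 1 crore
print(ppf_future_value(150_000, 0.071, 25)) # ~1.03 crore at the current 7.1% PPF rate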

"ChatGPT is not a diary, therapist, lawyer, or friend": LinkedIn user warns against oversharing everything with AI
"ChatGPT is not a diary, therapist, lawyer, or friend": LinkedIn user warns against oversharing everything with AI

Economic Times · 10 hours ago

ChatGPT users are being warned to think twice before typing anything personal into the chatbot. OpenAI CEO Sam Altman recently confirmed that interactions with ChatGPT aren't protected by confidentiality laws. Conversations you assume are private may be stored, reviewed, and even presented in court, no matter how sensitive, emotional or casual they seem.

'If you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, we could be required to produce that, and I think that's very screwed up,' Altman said in an interview on the This Past Weekend podcast. He added, 'We should have, like, the same concept of privacy for your conversations with AI that we do with a therapist or whatever.'

But as of now, that legal framework doesn't exist. Altman explained, 'Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's confidentiality. We haven't figured that out yet for ChatGPT.'

'ChatGPT can land you in jail'

This sharp warning is echoed by Shreya Jaiswal, a Chartered Accountant and founder of Fawkes Solutions, who posted her concerns on LinkedIn. Her message was blunt and alarming: 'ChatGPT can land you in jail. No, seriously. Not even joking,' she wrote.

According to Jaiswal, Altman's own words spell out the legal dangers. 'Sam Altman – the CEO of OpenAI, literally said that anything you type into ChatGPT can be used as evidence in court. Not just now, even months or years later, if needed. There's no privacy, no protection, nothing, unlike talking to a real lawyer or therapist who is sworn to client confidentiality.'

She laid out a few scenarios that, while hypothetical, are disturbingly plausible. Suppose someone types: 'I cheated on my partner and I feel guilty, is it me or the stars that are misaligned?' Jaiswal pointed out how this could resurface in a family court battle. 'Boom. You're in court 2 years later fighting an alimony or custody battle. That chat shows up. And your 'private guilt trip' just became public proof.'

Even seemingly harmless curiosity can be risky. 'How do I save taxes using all the loopholes in the Income Tax Act?' or 'How can I use bank loans to become rich like Vijay Mallya?' could be interpreted as intent during a future audit or legal probe. 'During a tax audit or loan default, this could easily be used as evidence of intent even if you never actually did anything wrong,' she warned.

In another example, she highlighted workplace risk. 'I'm thinking of quitting and starting my own company. How can I use my current company to learn for my startup?' This, she argued, could be used against you in a lawsuit for breach of contract or intellectual property theft. 'You don't even need to have done anything. The fact that you thought about it is enough.'

Treating ChatGPT like a therapist? Think again

Jaiswal expressed concern that people have become too casual, even intimate, with AI tools. 'We've all gotten way too comfortable with AI. People are treating ChatGPT like a diary. Like a best friend. Like a therapist. Like a co-founder. But it's none of those. It's not on your side, it's not protecting you. And legally, it doesn't owe you anything.'

She closed her post with a simple piece of advice: 'Let me make this simple – if you wouldn't say it in front of a judge, don't type it into ChatGPT.' And her final thought was one that many might relate to: 'I'm honestly scared. Not because I have used ChatGPT for something I shouldn't have. But because we've moved too fast, and asked too few questions, and continue to do so in the world of AI.'

Real cases, real fines

These concerns aren't just theory. In a 2024 bankruptcy case in the United States, a lawyer submitted a legal brief that cited fake court cases generated by ChatGPT. The judge imposed a fine of $5,500 and ordered the lawyer to attend an AI ethics session. Similar disciplinary actions were taken against lawyers in Utah and Alabama who relied on fabricated AI-generated citations. These incidents have underscored a critical truth: AI cannot replace verified legal research or professional advice. It can mislead, misrepresent, or completely fabricate information, which researchers call "AI hallucinations".

Younger users are relying too much on ChatGPT

Altman also flagged a worrying trend among younger users. Speaking at a Federal Reserve conference, he said, 'There are young people who say, "I can't make any decision in my life without telling ChatGPT everything that's going on. It knows me. I'm going to do whatever it says." That feels really bad to me.' He is concerned that blind faith in AI could be eroding people's ability to think critically. While ChatGPT is programmed to provide helpful answers, Altman stressed it lacks context, responsibility, and real emotional understanding.

What you should do

The advice is straightforward, and it applies to everyone:
Don't use ChatGPT to confess anything sensitive, illegal or personal.
Never treat it as a lawyer, therapist, or financial advisor.
Verify any factual claims independently.
Use AI to brainstorm, not to confess.
And most importantly, don't say anything to a chatbot that you wouldn't be comfortable seeing in court.

While OpenAI says that user chats are reviewed for safety and model training, Altman admitted that conversations may be retained if required by law. Even if you delete a conversation, legal demands can override that. With ongoing lawsuits, including one from The New York Times, OpenAI may soon have to store conversations indefinitely. For those looking for more privacy, Altman suggested considering open-source models that can run offline, like GPT4All by Nomic AI or Ollama. But he stressed that what's needed most is a clear legal framework. 'I think we will certainly need a legal or a policy framework for AI,' he said.

Until then, treat your chats with caution. Because what you type could follow you, even years later.
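On the offline-model suggestion at the end of the article: a model served locally through Ollama keeps prompts on your own machine rather than on a provider's servers. The sketch below is a minimal, hypothetical example that shells out to the Ollama command-line tool; it assumes Ollama is installed and that a model (here llama3) has already been downloaded with "ollama pull llama3".

# Minimal sketch: query a locally running open-source model via the Ollama CLI,
# so the prompt never leaves the machine. Assumes "ollama pull llama3" was run first.
import subprocess

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    # "ollama run MODEL PROMPT" prints the model's reply to stdout and exits.
    result = subprocess.run(
        ["ollama", "run", model, prompt],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    print(ask_local_model("Summarise the clubbing provisions of the Income Tax Act."))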

"ChatGPT is not a diary, therapist, lawyer, or friend": LinkedIn user warns against oversharing everything with AI
"ChatGPT is not a diary, therapist, lawyer, or friend": LinkedIn user warns against oversharing everything with AI

Time of India

time10 hours ago

  • Time of India

"ChatGPT is not a diary, therapist, lawyer, or friend": LinkedIn user warns against oversharing everything with AI

ChatGPT users are being warned to think twice before typing anything personal into the chatbot. OpenAI CEO Sam Altman recently confirmed that interactions with ChatGPT aren't protected by confidentiality laws. Conversations you assume are private may be stored, reviewed, and even presented in court — no matter how sensitive, emotional or casual they seem. 'If you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, we could be required to produce that, and I think that's very screwed up,' Altman said in an interview on the This Past Weekend podcast. He added, 'We should have, like, the same concept of privacy for your conversations with AI that we do with a therapist or whatever.' Explore courses from Top Institutes in Please select course: Select a Course Category Design Thinking others PGDM Data Analytics Cybersecurity Operations Management healthcare Public Policy Others Digital Marketing Management MBA CXO Leadership Degree MCA Product Management Artificial Intelligence Finance Technology Data Science Project Management Healthcare Data Science Skills you'll gain: Duration: 22 Weeks IIM Indore CERT-IIMI DTAI Async India Starts on undefined Get Details Skills you'll gain: Duration: 25 Weeks IIM Kozhikode CERT-IIMK PCP DTIM Async India Starts on undefined Get Details But as of now, that legal framework doesn't exist. by Taboola by Taboola Sponsored Links Sponsored Links Promoted Links Promoted Links You May Like Crossout: New Apocalyptic MMO Crossout Play Now Undo Altman explained, 'Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's confidentiality. We haven't figured that out yet for ChatGPT.' 'ChatGPT can land you in jail' This sharp warning is echoed by Shreya Jaiswal , a Chartered Accountant and founder of Fawkes Solutions, who posted her concerns on LinkedIn. Her message was blunt and alarming. Live Events 'ChatGPT can land you in jail. No, seriously. Not even joking,' she wrote. According to Jaiswal, Altman's own words spell out the legal dangers. 'Sam Altman – the CEO of OpenAI, literally said that anything you type into ChatGPT can be used as evidence in court. Not just now, even months or years later, if needed. There's no privacy, no protection, nothing, unlike talking to a real lawyer or therapist who is sworn to client confidentiality.' She laid out a few scenarios that, while hypothetical, are disturbingly someone types: 'I cheated on my partner and I feel guilty, is it me or the stars that are misaligned?' Jaiswal pointed out how this could resurface in a family court battle. 'Boom. You're in court 2 years later fighting an alimony or custody battle. That chat shows up. And your 'private guilt trip' just became public proof.' Even seemingly harmless curiosity can be risky. 'How do I save taxes using all the loopholes in the Income Tax Act ?' or 'How can I use bank loans to become rich like Vijay Mallya ?' could be interpreted as intent during a future audit or legal probe. 'During a tax audit or loan default, this could easily be used as evidence of intent even if you never actually did anything wrong,' she warned. In another example, she highlighted workplace risk. 'I'm thinking of quitting and starting my own company. How can I use my current company to learn for my startup?' This, she argued, could be used against you in a lawsuit for breach of contract or intellectual property theft. 'You don't even need to have done anything. 
The fact that you thought about it is enough.' Treating ChatGPT like a therapist? Think again Jaiswal expressed concern that people have become too casual, even intimate, with AI tools. 'We've all gotten way too comfortable with AI. People are treating ChatGPT like a diary. Like a best friend. Like a therapist. Like a co-founder.' 'But it's none of those. It's not on your side, it's not protecting you. And legally, it doesn't owe you anything.' She closed her post with a simple piece of advice: 'Let me make this simple – if you wouldn't say it in front of a judge, don't type it into ChatGPT.' And her final thought was one that many might relate to: 'I'm honestly scared. Not because I have used ChatGPT for something I shouldn't have. But because we've moved too fast, and asked too few questions, and continue to do so in the world of AI.' Real cases, real fines These concerns aren't just theory. In a 2024 bankruptcy case in the United States , a lawyer submitted a legal brief that cited fake court cases generated by ChatGPT. The judge imposed a fine of $5,500 and ordered the lawyer to attend an AI ethics session. — slow_developer (@slow_developer) Similar disciplinary actions were taken against lawyers in Utah and Alabama who relied on fabricated AI-generated citations. These incidents have underscored a critical truth: AI cannot replace verified legal research or professional advice. It can mislead, misrepresent, or completely fabricate information — what researchers call "AI hallucinations". Younger users are relying too much on ChatGPT Altman also flagged a worrying trend among younger users. Speaking at a Federal Reserve conference, he said, 'There are young people who say, 'I can't make any decision in my life without telling ChatGPT everything that's going on. It knows me. I'm going to do whatever it says.' That feels really bad to me.' He's concerned that blind faith in AI could be eroding people's ability to think critically. While ChatGPT is programmed to provide helpful answers, Altman stressed it lacks context, responsibility, and real emotional understanding. What you should do The advice is straightforward, and it applies to everyone: Don't use ChatGPT to confess anything sensitive, illegal or personal Never treat it as a lawyer, therapist, or financial advisor Verify any factual claims independently Use AI to brainstorm, not to confess And most importantly, don't say anything to a chatbot that you wouldn't be comfortable seeing in court While OpenAI claims that user chats are reviewed for safety and model training, Altman admitted that conversations may be retained if required by law. Even if you delete a conversation, legal demands can override those actions. With ongoing lawsuits, including one from The New York Times, OpenAI may soon have to store conversations indefinitely. For those looking for more privacy, Altman suggested considering open-source models that can run offline, like GPT4All by Nomic AI or Ollama. But he stressed that what's needed most is a clear legal framework. 'I think we will certainly need a legal or a policy framework for AI,' he said. Until then, treat your chats with caution. Because what you type could follow you — even years later.
