Conservative MP shares inaccurate, ChatGPT-generated stats on capital gains tax rate
An Ontario Conservative MP’s use of ChatGPT to share incorrect information online about Canada’s capital gains tax rate offers a cautionary tale to politicians looking to use AI to generate messages, one expert says.
Last week, MP Ryan Williams posted an AI-generated ranking of G7 countries and their capital gains tax rates on X (formerly known as Twitter).
The list appeared to have been generated by ChatGPT — an artificial intelligence-based virtual assistant — and falsely listed Canada’s capital gains tax rate as 66.7 per cent. The ChatGPT logo was shown in the screenshot Williams posted. The post has since been deleted.
The Liberal government’s increase to the capital gains inclusion rate — the amount of capital gains that are considered taxable income — has become a new point of attack for the Conservatives since the party voted against the change earlier this month. Conservative Leader Pierre Poilievre posted a nearly 16-minute online video about the change.
But the capital gains inclusion rate and tax rate are two different things. The inclusion rate is the portion of the capital gain subject to tax; capital gains tax rates themselves vary depending on income.
A capital gain is the difference between an asset’s cost and its total sale price. That asset could be a cottage, an investment property, a stock or a mutual fund. In Canada, primary residences are not included under the capital gains tax.
The changes that take effect Tuesday will raise the inclusion rate for individuals from 50 per cent to 66.7 per cent for capital gains above $250,000. For the first $250,000 in capital gains, only $125,000 is taxable. Two-thirds of every dollar beyond $250,000 will be taxable. Federal income tax rates are then applied to those amounts.
For corporations, there will be no $250,000 threshold. Two-thirds of all capital gains earned by corporations and trusts will be taxable.
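To make the arithmetic concrete, here is a minimal sketch in Python of how the inclusion rules described above apply to individuals and to corporations. The function names and the $400,000 example are illustrative assumptions, not figures from the article; federal and provincial income tax rates would then be applied to the taxable amounts.

```python
def taxable_capital_gain_individual(gain: float) -> float:
    """Taxable portion of an individual's capital gain under the new rules:
    50% inclusion on the first $250,000, two-thirds (66.7%) above that."""
    first_tier = min(gain, 250_000) * 0.50
    second_tier = max(gain - 250_000, 0) * (2 / 3)
    return first_tier + second_tier


def taxable_capital_gain_corporation(gain: float) -> float:
    """Corporations and trusts get no $250,000 threshold:
    two-thirds of the entire gain is taxable."""
    return gain * (2 / 3)


# Hypothetical example: a $400,000 capital gain.
# Individual: first $250,000 -> $125,000 taxable; remaining $150,000 -> $100,000 taxable.
print(taxable_capital_gain_individual(400_000))   # 225000.0
print(taxable_capital_gain_corporation(400_000))  # about 266666.67
```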
ChatGPT, a powerful chatbot that can generate information instantly and carry on human-like conversations, has revolutionized the way people use AI.
Users also have found that it makes factual mistakes and even struggles with simple math problems.
The capital gains tax rates for other G7 countries also appear to be slightly off in Williams's post.
It lists France's rate as 34 per cent. That country's rate is 30 per cent, with an additional four per cent for higher income earners, according to PricewaterhouseCoopers. Williams's post also lists the U.S. rate as 23.8 per cent, but it's actually 20 per cent federally.
CBC News reached out to the Conservative MP for comment but did not receive a response.
Fenwick McKelvey, a professor of information and communication technology at Concordia University, said Williams's post shows the pressure politicians are under to put out content that promotes their message.
“You turn to ChatGPT to speed up some of your content production. The problem with ChatGPT, as we all know, is that you don’t know the results, they don’t know how accurate it is and there’s no fact-checking,” he told CBC News.
The impact of AI on politics and election campaigns is becoming a pressing topic of discussion in Canada and the U.S.
Caroline Xavier, head of the Communications Security Establishment (CSE) — Canada’s cyber intelligence agency — has told CBC News that she’s concerned the use of AI could amplify the spread of misinformation.
Earlier this year, a group of researchers at Columbia University tested how five large language models — including ChatGPT — responded to a set of prompts about primary contests in the U.S.
All five failed to varying degrees when asked to respond to basic questions about the American democratic process — such as where a voter could find the nearest polling place.
McKelvey said that while AI technology like ChatGPT could have some uses for politicians crafting messages, it needs to be better regulated.
“We’ve really left all AI regulation to private companies like OpenAI, who are making up rules on the fly,” he said.
OpenAI, the company that created ChatGPT, provides AI technology to tech companies like Apple and Microsoft. But in January, the company updated its policies and suggested it would restrict the use of its technology by political campaigns and lobbyists.
“We’re still working to understand how effective our tools might be for personalized persuasion. Until we know more, we don’t allow people to build applications for political campaigning and lobbying,” the company said in a media statement.
OpenAI said in May that it’s updating ChatGPT to direct users to official sources for voter information.
Last year, federal and provincial privacy commissioners announced a joint investigation into whether ChatGPT was collecting and disclosing Canadians’ personal information without consent.
McKelvey said discussions about AI technology also need to focus on its impact on the "information ecosystem."
“What it really is doing is contributing to a more systemic issue where Canada’s had relatively high levels of trust comparatively in media systems and AI could be contributing to a decline in that trust,” he said.
Without further regulation, McKelvey said, gaffes like Williams’s post might be the only way to police AI-generated misinformation.
“Politicians might think twice if their use of this technology gets them in hot water,” he said.