B.C. Couple Cites AI-Generated Fake Court Rulings in Condo Dispute, Tribunal Finds

Noah Chen

2/18/2025 · 2 min read

A B.C. couple’s attempt to use artificial intelligence to support their condo dispute backfired when the Civil Resolution Tribunal (CRT) found that nearly all of the court cases they cited were fabricated by a chatbot.

According to the CRT’s decision issued last week, Robert and Michelle Geismayr were seeking approval from their Kelowna strata corporation for unauthorized alterations made by the previous owner of their unit. To support their argument, they submitted ten legal cases that they claimed proved strata corporations could not force owners to remove alterations.

However, nine of the ten cases didn’t exist, according to tribunal member Peter Mennie.

“I find it likely that these cases are ‘hallucinations,’ where artificial intelligence generates false or misleading results,” Mennie wrote in his Feb. 14 ruling.

AI-Generated Cases Fooled Owners

The Geismayrs said they obtained the cases from Microsoft Copilot, an AI-powered tool, believing them to be legitimate.

Robert Geismayr said he was shocked and disappointed when he learned the cases were fake.

"It was very disappointing and puts ambiguity about what we can trust," he said.

Although he now understands the risks, Geismayr says he will still use AI for general research—but not for legal or other serious matters.

The case highlights a growing issue of AI-generated misinformation in legal proceedings. Last week, lawyers in Wyoming faced scrutiny after citing fake AI-generated cases in a lawsuit against Walmart. Similarly, in 2023, a B.C. lawyer was ordered to personally compensate opposing counsel after using AI-generated rulings in a family court case.

Unauthorized Condo Alterations Led to Dispute

The Geismayrs purchased their Kelowna condo in 2020, knowing that the previous owner had made unapproved alterations—including adding a loft, moving a fire alarm, and modifying fire sprinklers—without strata approval.

A stop-work order had been issued due to the lack of permits, and the alterations meant the unit could not be rented in the building’s hotel-style rental pool.

To comply with rental guidelines, the Geismayrs sealed off the loft and were later allowed to rent out the unit for three ski seasons.

However, when they formally requested retroactive approval from the strata, it was denied, with the corporation demanding that the loft be fully removed.

The strata council argued that allowing the alterations to remain would set a precedent for other owners to bypass approval processes.

Case Dismissed Due to AI Misinformation

The Geismayrs filed a claim with the Civil Resolution Tribunal, relying on ten AI-generated court rulings to argue that strata corporations could not force owners to remove alterations.

However, the tribunal found that only one case was real, and it was not related to unauthorized alterations.

“The state of the law is very different than what Copilot reported,” Mennie wrote in his decision.

The tribunal dismissed the claim, reiterating that owners cannot reasonably expect retroactive approval for unauthorized alterations.

“The strata’s refusal to approve the alterations does not rise to the level of significant unfairness,” the ruling stated.

The case serves as a cautionary tale about the risks of using AI-generated legal information, reinforcing the need for human verification before relying on such tools in legal disputes.