Figma pulls AI design tool for seemingly plagiarizing Apple’s Weather app
Web and app design toolmaker Figma has temporarily pulled its generative AI “Make Design” feature because it seems to think Apple’s Weather app is the be-all and end-all of mobile forecasting.
The AI text-to-app-design feature's affinity for mimicking Apple's design language was discovered by NotBoring Software founder Andy Allen after he asked the platform to generate a "not boring weather app."
Allen said the tool would consistently offer near-replicas of Apple's own weather application, which ships with all iOS devices. The behavior led Allen to speculate that Figma had used existing app designs to train the service. "Figma AI looks rather heavily trained on existing apps," he wrote in a follow-up post.
Following the discovery, Figma CEO Dylan Field wrote in a Xitter thread that the biz would "temporarily disable the Make Design feature" until fixes were made to prevent this behavior.
Field also attempted to dismiss allegations that the service had been trained on popular third-party app designs. "The Make Design feature is not trained on Figma content, community files, or app designs. In other words, accusations around data training in this tweet are false," he wrote in response to Allen's findings.
Yet Field went on to say the feature is built using off-the-shelf large language models that work in conjunction with "design systems" Figma commissioned. The problem, he explained, lies with these "design systems," adding that the aping of Apple's weather software could have been prevented with additional quality assurance steps.
We might take that to mean Figma commissioned a bunch of designs to train its generative tool, and some of that design work looked a lot like Apple’s, hence the feature’s output.
"I hate missing the mark, especially on something I believe is so fundamentally important to the future of design," Field concluded.
Speaking to The Register, Allen said he thought Field's explanation made sense, though he noted that he never claimed the service was trained on user or community data.
“The real issue between the GenAI companies and creators is that none of the companies have been open about how these models are trained, with what data, and what rights were secured. It’s like the fast food of creativity where all the ingredients and processes are hidden away,” he lamented.
Allen also made it clear that he isn’t opposed to generative AI in app design.
"GenAI in general seems to be fine at making mid stuff for reference, but I find it's mostly not usable for most finished work. It'll get much better over time and has a long future ahead," he said, adding that other AI features Figma has introduced – like auto-naming layers and localization – are fantastic. "The Make Designs feature probably just needed a bit more time to bake."
While Field insists the Make Design feature wasn't trained directly on existing apps – just blueprints that closely resemble them, perhaps – the service's behavior raises questions over who is responsible if a model generates something that arguably violates copyright.
Several tech giants, including Google and Microsoft, have extended limited legal protections against copyright claims to users of their generative AI services.
Figma didn't directly address The Register's questions regarding how and why the Make Design feature behaved the way it did, nor what legal protections it currently offers or plans to offer users in the future. Instead, a spokesperson directed us to Field's Xitter post and a page detailing the company's AI approach.
Considering the Make Design feature's behavior, Allen in a separate missive suggested that users of the service "thoroughly check existing apps, or modify the results heavily" to avoid potential future legal trouble. ®