It’s a little funny that Will Smith’s celebrity may be getting dinged by an AI act he did. Because AI gained its celebrity from something Will Smith didn’t do.
By now you may have heard the allegations of AI in footage from Smith's live performance of his new song "You Can Make It," which shows adoring crowds tearing up and holding signs at an undisclosed location, or you may have even seen the strange smoothings and distortions yourself.
But back in early 2023, a video of Smith eating spaghetti went viral because it was AI. Smith never ate that spaghetti, but that didn't stop so many of us from oohing over what the new tech could do. A year removed from The Slap, we didn't know how we felt about Will Smith. But we suddenly loved AI.
Two years later, Smith was allegedly using AI to go viral … and walking off a cliff.
The new video is likely — though no one has confirmed it — the result of “upscaling,” a kind of AI enhancement that can make it seem like humans in your shot were either there in greater numbers or with more excitement than they really were.
People tearing up when they weren't crying, people holding up signs they never raised, or people who weren't really people at all: the whole thing is meant to give an impression of Smith's popularity. "My favorite part of tour is seeing you all up close. Thank you for seeing me too," Smith wrote on social media. Sleuths who zeroed in on the blended hands and inverted words could only smirk at the irony.
A video posted Aug. 12 to Will Smith's official YouTube page racked up 450,000 views, but viewers scrutinized it for tell-tale signs of AI enhancement, like too many fingers or signs held at unnatural angles.
Smith's video in one sense offers a singular phenomenon. Here was a former A-lister many of us had cooled to unwittingly proving the very point he was coming to negate. Has there ever been a more cringe act by a massive movie star? Has there been a more sus use of AI? If you're going to pull a muscle telling us how much people love you, at least find some real people to do it. This seemed so thirsty, so unjiggy.
But we may want to hit pause on the celebrity schadenfreude. Because Smith’s act is not singular at all. It is something universal, or at least soon-to-be universal. Influencers and spinmeisters have been using AI upscaling for years, if quietly, the way you might round up your current salary in a job interview. It’s only going to grow more popular as the tools get better. (And they will — you just need some tweaks to the model and increases in compute to erase these hallucinations.) In fact, when the chapter on the early AI Age is written, the line about this moment is less likely to be, “Remember when Will Smith did something cringily AI?” and more, “Remember when AI was still seen as so cringe that we made fun of Will Smith for it?”
Experts differ on the timeline, but everyone agrees it’s just years if not months before we’ll stop being able to spot an AI video. “You Can Make It” had the particular misfortune of coming out at this interregnum moment: good enough for someone to use but not so good we can’t spot it. That moment will be over soon enough, and, I suspect, so will our pearl-clutching.
The main effect of this new age of the synthetic is that video will stop being a meaningful measure of truth. We long ago stopped believing everything we read, and AI image generators have killed what Photoshop wounded. But video until now has been the last bastion of objectivity: incontrovertible evidence that an event took place the way it seemed to.
Once this happens, the industry consequences will follow. Publicists will sputter trying to dazzle us with something we know could be AI (I wouldn't want to be Tom Cruise's handler when he shoots his next stunt sequence) or to convince us that their client didn't do the thing the video says they did. Brand managers will be at a loss attempting similar damage control. And forget media companies. Your stock-in-trade as a TV news division is landing the video no one else has. Now even if you did, who'd believe it?
The effects on democracy will be even more devastating. If you think political disinformation is bad now, imagine when any video can look like anything. Hany Farid, a UC Berkeley professor who’s spent years studying this stuff, told me just that recently: “It will get to the point where it will be exceedingly difficult to tell AI content without real interventions … [and] if pretty much anybody can create content that is this deceptive, we are in trouble, as a democracy and a society.”
But there is an upside. (Really.) Without a format that can telegraph objectivity, we'll need to (if we care to) turn to another way of assuring ourselves of the facts: the source of the video. That could mean the human content creator will matter more. After years of seeing news brands take a beating in the trust department, they'll soon become the only hope we have of knowing whether something happened. We no longer will be able to trust the medium. But we may newly believe the media.
The phenomenon will cut across all parts of the landscape. Still, there is a final irony in who’s playing it out — the person who, from The Fresh Prince of Bel-Air to Men in Black to Bad Boys to King Richard, has dominated our visual media for 35 years. Call it The Will Smith Paradox. The man who made us most believe in the power of images now shapes a trend in which we may never trust them again.