A bipartisan group of state House lawmakers wants new regulations for artificial intelligence.
The bill would make it a misdemeanor to create what's known as "deepfake" AI content. That would include deceptive videos where someone appears to be saying or doing something they never said or did. People misrepresented in "deepfake" content would also have the ability to sue.
The bill would also limit lawsuits against AI developers over errors made by AI products. Someone suing a licensed professional over a mistake, for example, couldn't also sue the AI developer whose products were used by the professional in the faulty work.
Rep. Jake Johnson, R-Polk, is sponsoring the bill with Democratic Reps. Zack Hawkins and Vernetta Alston.
"What we don't want to do is over-regulate to where North Carolina gets a bad reputation for a bad place to do business, developing or selling this technology or even using it in state government," Johnson said, adding that the legislation is a work in progress that could change as he receives more feedback.
"We would rather get this done the right way and make North Carolina the true place that people want to come with new business when it comes to developing and installing these AI products," he added.
Several legislators were interested in how the bill would affect campaign ads that use fake AI imagery to misrepresent a politician's words or actions, such as a "fake mugshot." Johnson said the bill would allow AI-generated elements in ads, such as music or stock footage, but would prohibit using AI to create misinformation about the main subject of the ad.
Other legislation has been filed in recent years to address deepfake campaign ads, but none has become law so far.
The bill passed its first committee vote Tuesday and now heads to the House Judiciary 3 Committee.