Media tycoon and billionaire Barry Diller has urged publishers to fight back against AI, saying that the proliferation of AI chatbots could destroy the industry.
Speaking at Monday’s Semafor Media Summit in Manhattan, Diller said companies should sue for copyright infringement to prevent AI systems from harvesting content and repackaging it.
(This week also saw China’s top internet regulator release a sweeping draft regulation mandating that all domestically developed artificial intelligence products undergo security and content reviews to ensure they uphold core socialist values.)
Diller’s fear is that all the world’s knowledge will be swallowed up by the bots and effectively repackaged. He is chairman of media behemoth IAC, which owns People, Entertainment Weekly, and dozens of digital news and information sites whose content is vulnerable to scraping.
The publishing tycoon drew parallels between the current era of AI chatbots like ChatGPT and the early days of the internet, when news institutions that had relied on subscription revenue began distributing stories online for free, destroying their business models.
Concerns over the effects of AI on national security and education led the Biden administration to announce on Tuesday that it is soliciting public feedback on possible accountability standards for AI systems. The request for comment comes from the National Telecommunications and Information Administration, part of the Department of Commerce.
The safety of AI remains an open question. President Joe Biden remarked last week on its safety and national security implications, saying technology firms must ensure their products are secure before releasing them to the general public.
ChatGPT was banned in Italy for violating privacy laws, and authorities there and elsewhere in Europe have begun scrutinizing the technology’s benefits and risks more closely.
Elon Musk was among the signatories of an open letter urging a pause on the development of powerful AI systems so their risks can be studied more closely.