US FTC to Enforce Law to Rein in Risks Posed by AI as Calls for Regulation Grow

The Federal Trade Commission’s chief said the agency was committed to using existing laws to rein in some of the dangers of artificial intelligence, such as enhancing the power of dominant firms and “turbocharging” fraud.

“Although these (AI) tools are novel, they are not exempt from existing rules, and the FTC will vigorously enforce the laws we are charged with administering, even in this new market,” FTC Chair Lina Khan wrote in an opinion piece in the New York Times on Wednesday.

The sudden popularity of Microsoft-backed OpenAI’s ChatGPT this year has prompted global calls for regulation amid concerns about its possible use for wrongdoing even as companies seek to use it to enhance efficiency.

She described the agency as “well equipped” to handle the job.

One risk she noted was that firms that dominate cloud services and computing would become even more powerful as they help startups and other firms launch their own AI. AI tools could also be used to facilitate collusion to raise prices.

Khan expressed concern that generative AI, which writes in conversational English, could be used to help scammers write more specific and effective phishing emails.

“When enforcing the law’s prohibition on deceptive practices, we will look not just at the fly-by-night scammers deploying these tools but also at the upstream firms that are enabling them,” she wrote. 

© Thomson Reuters 2023  


