US Senator Schumer: AI meeting discussed urgent actions needed before election

U.S. Senate Majority Leader Schumer hosts tech leaders and experts at AI forum at the U.S. Capitol in Washington

By Moira Warburton and Kanishka Singh

WASHINGTON (Reuters) - U.S. Senate Majority Leader Chuck Schumer said on Wednesday that an artificial intelligence forum he organized with U.S. lawmakers and tech CEOs focused on urgent actions that are needed before the 2024 elections.


Several governments are considering how to mitigate the dangers of the emerging technology. U.S. lawmakers want safeguards against potentially dangerous applications such as deepfake videos, election interference and attacks on critical infrastructure.


Schumer said there were discussions on some immediate actions needed before the next U.S. elections in 2024.

"The issue of actually having deep fakes where people really believe that somebody, that a campaign was saying something when they were the total creation of AI" was discussed in the meeting, he told reporters.

"We talked about watermarking. So you know, a lot of things that have to be done, but that one has a quicker timetable maybe than some of the others and it's very important to do."


* Schumer's comments came after American technology leaders including Tesla CEO Elon Musk, Meta Platforms CEO Mark Zuckerberg and Alphabet CEO Sundar Pichai met with lawmakers on Capitol Hill on Wednesday for a closed-door forum focused on regulating AI.

* Some companies have committed to developing a system to "watermark" all forms of AI-generated content, from text and images to audio and video, so that users will know when the technology has been used.

* Such a watermark, embedded in the content by technical means, would presumably make it easier for users to spot deepfake images or audio that might, for example, depict violence that never occurred, make a scam more convincing, or distort a photo of a politician to cast the person in an unflattering light.
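To illustrate the basic idea behind the watermarking the bullets above describe, here is a minimal, purely illustrative sketch: it hides a short machine-readable tag in the least-significant bits of raw pixel bytes and reads it back. This is classic LSB steganography, not any company's actual scheme; production systems (e.g., cryptographically signed provenance metadata or statistical watermarks) are far more robust and tamper-resistant, and the tag, function names, and stand-in image data here are all hypothetical.

```python
# Illustrative sketch only: a toy least-significant-bit (LSB) watermark.
# Real AI-content watermarks use more robust, tamper-resistant techniques.

TAG = b"AI-GENERATED"  # hypothetical machine-readable marker


def embed_watermark(pixels: bytearray, tag: bytes = TAG) -> bytearray:
    """Write each bit of `tag` into the LSB of successive pixel bytes."""
    out = bytearray(pixels)
    # Flatten the tag into individual bits, least-significant bit first.
    bits = [(byte >> i) & 1 for byte in tag for i in range(8)]
    if len(bits) > len(out):
        raise ValueError("content too small to hold watermark")
    for idx, bit in enumerate(bits):
        out[idx] = (out[idx] & 0xFE) | bit  # overwrite the lowest bit
    return out


def extract_watermark(pixels: bytes, length: int = len(TAG)) -> bytes:
    """Reassemble the tag by reading the LSBs back out in the same order."""
    result = bytearray()
    for byte_idx in range(length):
        value = 0
        for i in range(8):
            value |= (pixels[byte_idx * 8 + i] & 1) << i
        result.append(value)
    return bytes(result)


# Stand-in for raw image pixel data.
image = bytearray(range(256)) * 4
marked = embed_watermark(image)
recovered = extract_watermark(marked)
```

A detector checking `recovered == TAG` would flag the content as AI-generated; the embedded bits change each pixel value by at most 1, so the mark is invisible to a viewer but trivially erased by re-encoding — which is exactly why real deployments pair watermarks with signed metadata.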

(Reporting by Moira Warburton; writing by Kanishka Singh; Editing by Leslie Adler and David Gregorio)