Facebook to Train AI Systems Using Police Video - NBC10 Boston

    Vincent Yu/AP (File)
    FILE - A police officer stands guard at a park outside the Al Noor mosque in Christchurch, New Zealand, March 20, 2019.

    Facebook will work with law enforcement organizations to train its artificial intelligence systems to recognize videos of violent events as part of a broader effort to crack down on extremism.

    Facebook's AI systems were unable to detect live-streamed video of a mass shooting at a mosque in Christchurch, New Zealand.

    The effort will use body-cam footage of firearms training provided by U.S. and U.K. government and law enforcement agencies. The aim is to develop systems that can automatically detect first-person violent events without also flagging similar footage from movies or video games.

    It's also expanding its definition of terrorism to include not just acts of violence intended to achieve a political or ideological aim, but also attempts at violence, especially when aimed at civilians with the intent to coerce and intimidate.

    Facebook has been working to limit the spread of extremist material on its service, so far with mixed success. In March, it expanded its definition of prohibited content to include U.S. white nationalist and white separatist material as well as that from international terrorist groups. It says it has banned 200 white supremacist organizations and removed 26 million pieces of content related to global terrorist groups like ISIS and al Qaeda.

    Extremist videos are just one item in a long list of troubles Facebook faces. It was fined $5 billion by U.S. regulators over its privacy practices. A group of state attorneys general has launched its own antitrust investigation into Facebook. And it is also part of broader investigations into "big tech" by Congress and the U.S. Justice Department.

    More regulation might be needed to deal with the problem of extremist material, said Dipayan Ghosh, a former Facebook employee and White House tech policy adviser who is currently a Harvard fellow.

    "Content takedowns will always be highly contentious because of the platforms' core business model to maximize engagement," he said. "And if the companies become too aggressive in their takedowns, then the other side — including propagators of hate speech — will cry out."