Japanese company creates AI that can spot shoplifters before they steal

Controversial new software developed by Japanese startup Vaak could be used to identify potential shoplifters based on their body language.

The system is trained to recognize ‘suspicious’ activities such as fidgeting or restlessness in security footage, according to Bloomberg Quint.

The system is designed to crack down on theft, the idea being that staff can approach a potential thief once alerted. But predictive policing efforts have sparked concerns that people may be unfairly targeted as a result of racial and other biases.

Vaak’s criminal-detecting AI can alert staff to suspicious behaviour via a smartphone app once it has spotted something in the CCTV stream, according to Bloomberg.

The Minority Report-style system was used last year to successfully track down a person who had shoplifted from a convenience store in Yokohama.

Ideally, however, the startup wants its technology to serve as a preventative measure.

Vaak says its AI can distinguish between normal customer behaviour and ‘criminal behaviour,’ such as tucking a product away into a jacket without paying.

But it can also detect what could be the warning signs of a theft before it actually happens.

In that case, staff could be alerted and sent over to approach the person, asking if they need any help in the hope of heading off the theft, according to Bloomberg.
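To make the workflow described above concrete, here is a minimal, purely hypothetical sketch of that kind of pipeline: score each person’s behaviour seen in CCTV frames and push an alert to a staff app when the score crosses a threshold. None of the names, behaviour labels, weights or the threshold below come from Vaak; the behaviour classifier is a stub standing in for the company’s video model.

    # Hypothetical sketch only - not Vaak's actual system or API.
    from dataclasses import dataclass

    SUSPICION_THRESHOLD = 0.8  # assumed cut-off, not a published value

    @dataclass
    class Observation:
        person_id: str
        behaviour: str  # e.g. "browsing", "fidgeting", "concealing_item"

    def suspicion_score(obs: Observation) -> float:
        """Stub standing in for a video-behaviour model; weights are made up."""
        weights = {"browsing": 0.1, "fidgeting": 0.6, "concealing_item": 0.95}
        return weights.get(obs.behaviour, 0.0)

    def notify_staff(obs: Observation, score: float) -> None:
        """Placeholder for the smartphone-app alert the article mentions."""
        print(f"ALERT: person {obs.person_id} flagged (score {score:.2f}) - "
              f"send staff over to offer help")

    def process_frame(observations: list[Observation]) -> None:
        # Check every person seen in the current frame and raise alerts.
        for obs in observations:
            score = suspicion_score(obs)
            if score >= SUSPICION_THRESHOLD:
                notify_staff(obs, score)

    # Example: only the "concealing_item" observation triggers an alert.
    process_frame([
        Observation("A", "browsing"),
        Observation("B", "fidgeting"),
        Observation("C", "concealing_item"),
    ])

In a real deployment the stubbed classifier would be replaced by a model trained on security footage, which is where the bias concerns discussed later in the article arise.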

Vaak is now testing its system in dozens of stores around Tokyo, and says the technology could be expanded to applications beyond crime prediction, including video-based checkout systems.

Predictive policing technology has spread in recent years, with secretive trials in China and even in some parts of the US.

In 2018, it was revealed that controversial Silicon Valley startup Palantir had been working with the New Orleans Police Department to test a system that predicts where crimes are more likely to occur, and who is most likely to commit them.

But experts warn that these algorithms can inherit biases from their training data.

Source: https://www.dailymail.co.uk
