
“Ethnicity recognition” tool listed on surveillance camera app store built by fridge-maker’s video analytics startup

The bizarre promotional video promises “Face analysis based on best of breed Artificial Intelligence algorithms for Business Intelligence and Digital Signage applications.” What follows is footage of a woman pushing her hair behind her ears, a man grimacing and baring his teeth, and an actor in a pinstripe suit being slapped in the face against a green screen. Digitally overlaid on each person’s face are colored rectangle outlines with supposed measurements displayed: “F 25 happiness,” “caucasian_latin,” “M 38 sadness.”
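For readers unfamiliar with how overlays like these are produced: analytics pipelines of this kind typically run a detector on each frame and paint a labeled rectangle over every detection. The minimal Python sketch below, built on OpenCV’s stock Haar cascade face detector, is illustrative only; the input file name is hypothetical, and the label is a hard-coded stand-in for whatever a vendor’s demographic or emotion classifier would emit.

```python
import cv2

# Illustrative sketch of a per-frame overlay, mimicking the promo video.
# The label is hard-coded; a real product would plug a classifier in here.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("promo_footage.mp4")  # hypothetical input file
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.1, 5):
        # The colored rectangle around the detected face...
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # ...and the label a classifier would supply, drawn just above it.
        cv2.putText(frame, "M 38 sadness", (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("overlay", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

Note that nothing in this loop validates the label: whatever string a classifier emits gets painted onto the frame with the same apparent authority as the rectangle itself.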

The commercial reel advertises just one of the many video analytics tools available for download on an app store run by the Internet of Things startup Azena, itself a venture of the German kitchen appliance maker Bosch.

Bosch, known more for its line of refrigerators, ovens, and dishwashers, also develops and sells an entire suite of surveillance cameras. Those surveillance cameras have become increasingly “smart,” according to recent reporting from The Intercept, and to better equip them with smart capabilities, Bosch has tried to emulate the success of the smartphone—offering an app store through Azena where users can download and install new, developer-created tools onto Bosch camera hardware.

According to Bosch and Azena, the apps are safe, the platform is secure, and the entire project is innovative.

“I think we’re just at the beginning of our development of what we can use video cameras for,” Azena CEO Hartmut Schaper told The Intercept.

Facial recognition’s flaws

Many of the available apps on the Azena app store claim to provide potentially useful analytics, like alerting users when fire or smoke is detected, monitoring when items are out of stock on shelves, or checking for unattended luggage at an airport. But others veer into the realm of pseudo-science, claiming to be able to scan video footage to detect signs of “violence and street fighting,” and, as The Intercept reported, offering up “ethnicity detection, gender recognition, face recognition, emotion analysis, and suspicious behavior detection.”

Such promises about video analysis have flooded the market for years, but their accuracy has always been suspect.

In 2015, the image recognition algorithm rolled out in Google Photos labeled Black people as gorillas. In 2018, the organization Big Brother Watch found that the facial recognition technology used by the UK’s Metropolitan Police at the Notting Hill carnival returned false matches 98 percent of the time. And in the same year, the American Civil Liberties Union scanned the face of every US Congress member against a database of alleged criminal mugshots using Amazon’s own facial recognition technology, Rekognition, and found that it made 28 erroneous matches.
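The mechanism behind erroneous matches like these is worth spelling out: face recognition systems reduce each face to a numeric vector, an “embedding,” and declare a match whenever two embeddings are more similar than some threshold. The toy Python sketch below uses random vectors rather than any vendor’s real system, but it shows how loosening that threshold inflates false matches, which is the knob at the heart of the ACLU experiment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for face embeddings: 535 "members of Congress" and
# 25,000 "mugshots". By construction, no pair is the same person,
# so every match the threshold admits is a false match.
congress = rng.normal(size=(535, 128))
mugshots = rng.normal(size=(25_000, 128))

def normalize(vectors):
    return vectors / np.linalg.norm(vectors, axis=1, keepdims=True)

congress, mugshots = normalize(congress), normalize(mugshots)
similarities = congress @ mugshots.T  # cosine similarity of every pair

for threshold in (0.45, 0.40, 0.35):
    false_matches = int((similarities >= threshold).sum())
    print(f"threshold {threshold:.2f}: {false_matches:,} false matches")
```

Even in this toy setup, dropping the threshold by a tenth multiplies the false matches by orders of magnitude, and a vendor’s default threshold is exactly the kind of setting a camera app customer is unlikely to ever inspect.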

When it comes to analyzing video footage to produce more nuanced results, like emotional states or an unfounded calculation of “suspicion,” the results are equally bad.

According to a recent report from Article 19, an organization that defends freedom of expression worldwide, “emotion recognition technology is often pseudoscientific and carries enormous potential for harm.”

One need look no further than the promotional video described earlier. In the span of less than one second, the actor being slapped in the face goes from being measured as “east_asian” and “M 33 sadness” to “caucasian_latin” and “M 37 sadness.”

Of equal concern are the security standards Azena has put in place on its app store.

Security and quality concerns

According to documentation viewed by The Intercept, Azena reviews submitted apps for their “data consistency,” and the company also “performs ‘a virus check’ before publishing to its app store. ‘However,’ reads the documentation, ‘we do not perform a quality check or benchmark your app.’”

That process differs from those of the Apple App Store and the Google Play Store.

“When it comes to Apple, there’s definitely more than just a virus scan,” said Thomas Reed, director of Mac and Mobile at Malwarebytes. “From what I understand, there’s a multi-step process designed to flag both App Store rule violations and malicious apps.”

That doesn’t mean that junk apps don’t end up on the Apple App Store, Reed said—it just means that there’s a known, public process about what types of apps are and are not allowed. And that same premise is true for the Google Play Store, as Google tries to ensure that submitted apps do not break an expansive set of policies meant to protect users from being scammed out of money, for example, or from invasive monitoring. In 2020, for instance, Google implemented stricter controls against stalkerware-type applications.

According to The Intercept’s reporting, though, Azena’s review process relies heavily on the compliance of its developers. The Intercept wrote:

“Bosch and Azena maintain that their auditing procedures are enough to weed out problematic use of their cameras. In response to emailed questions, spokespeople from both companies explained that developers working on their platform commit to abiding by ethical business standards laid out by the United Nations, and that the companies believe this contractual obligation is enough to rein in any malicious use.

At the same time, the Azena spokesperson acknowledged that the company doesn’t have the ability to check how their cameras are used and doesn’t verify whether applications sold on their store are legal or in compliance with developer and user agreements.”

The Intercept also reported that the operating system used on modern Bosch surveillance cameras may be out of date. That operating system is a “modified version of Android,” according to The Intercept, which means Bosch’s cameras could feasibly receive some of the same updates that Android receives. But when The Intercept asked a cybersecurity researcher to examine the updates Azena has publicized, the researcher found they only accounted for vulnerabilities patched through 2019.
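For context, a stock Android device exposes its shipped patch level as a readable system property, which is presumably the sort of starting point for checking a claim like this. The sketch below assumes a standard Android build with USB debugging enabled; whether Bosch’s modified Android exposes this property, or a debug bridge at all, is an assumption here, not something the reporting confirms.

```python
import subprocess

# On a stock Android device with USB debugging enabled, the security
# patch level (e.g. "2019-11-05") is exposed as a system property
# readable over adb. Whether an Azena/Bosch camera exposes adb or
# this property at all is an assumption.
result = subprocess.run(
    ["adb", "shell", "getprop", "ro.build.version.security_patch"],
    capture_output=True, text=True, check=True,
)
print("Security patch level:", result.stdout.strip())
```

A patch-level date stuck in 2019 on a device sold years later would line up with what the researcher consulted by The Intercept described.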

In speaking with The Intercept, Azena’s Schaper denied that his company is failing to install necessary security updates, and he explained that some of the vulnerabilities in the broader Android ecosystem may not apply to the cameras’ operating system because of features that do not carry from one device to another, like Bluetooth connectivity.

A bigger issue

Malwarebytes Labs has written repeatedly about invasive surveillance—from intimate partner abuse to targeted government spying—but the mundane work of security camera analysis often gets overlooked.

It shouldn’t.

With the development of the Azena app platform and its many applications, an entire class of Internet of Things devices—surveillance cameras—has become a testing ground for video analysis tools that have little evidence to support their claims. Emotion recognition tools are nascent and largely unscientific. “Ethnicity recognition” seems forever stuck in the past, plagued by earlier examples of when a video game console couldn’t recognize dark-skinned players and when a soap dispenser famously failed to work for a Facebook employee in Nigeria. And “suspicious behavior” detection relies on someone, somewhere, determining what “suspicious” is, without having to answer why they feel that way.

Above all else, the very premise of facial recognition has failed to prove itself, with multiple recent experiments showing embarrassing failure rates.

This is not innovation. It’s experimentation without foresight.
