While at CES last week, I saw several companies showing off video cameras and demonstrating their AI capabilities by superimposing stick figures on live images of people walking through their booths. As each person walked by, a stick figure was overlaid on their image, showing how well the computer’s AI captured their movements.
I’ve seen stick figures in a fall-detection and people-tracking lamp from a company called Nobi, and in demos from at least two companies trying to sell cameras to retailers so they can track where customers stand in the store. When I saw these stick figures, I wondered if they were the key to bringing more privacy to a world determined to put cameras everywhere.
While many of us are familiar with the bounding box placed around cars or people in AI demonstrations to show what the computer is tracking and trying to identify, few of us know what the computer “sees” when AI is used to track activities via a camera.
However, the image above shows what the cameras can see and what they can output while still delivering relevant information. Sony Semiconductor unveiled the image alongside the launch this week of a new AI imaging platform with Microsoft that will allow companies to deploy, train and control AI models on cameras powered by Sony’s AITRIOS sensor platform.
I’m writing about this because, after casual demonstrations of camera and AI capabilities at CES that made me feel like I was in a reality TV show or a dystopian sci-fi novel, I was ready to learn about privacy-preserving versions of the camera technology and to understand how they work.
I’ve long been hesitant about the proliferation of cameras in the IoT, but I’ve grown to embrace them—mainly because I don’t feel like I have much of a choice. Cameras can provide a ton of data and at a lower cost than other sensors, meaning they will be increasingly ubiquitous.
Indeed, we are all used to being photographed by ordinary people using doorbell cameras, municipal cameras, dash cams, and even smartphones. But once those photos are taken, they’re stored in the cloud, easily searchable and shareable. In other words, being in a public space now carries the risk that anything you do can be captured, stored, and shared without context, without consent, and then remain forever searchable.
Being out in public, in other words, has become a riskier proposition than it was 20-30 years ago. A trip to the store or a potluck has the potential to draw as much individual attention as celebrities walking the red carpet.
I don’t know if people can really live like this. I don’t want to, and my stakes are incredibly low. Sure, I’ve stumbled on doorsteps and had moments of rage on a crowded subway platform, clips of which I wouldn’t want plastered all over the internet. But I’m not hiding my sexuality from a conservative boss, avoiding a stalker, or seeking asylum in another country to escape political persecution at home. Cameras in public places can serve countless valuable functions, but they can also cause irreparable harm.
So I wanted to see the stick figures at CES and learn about Sony’s latest technology, which performs image processing in the camera itself rather than sending images to the cloud. Companies that choose to use cameras with Sony’s AITRIOS technology, or developers who build models for use with AITRIOS-enabled cameras, don’t have to choose the most privacy-preserving settings, but I like that Sony has made them more accessible.
Often, developers or camera buyers simply pick whatever is easy and readily available off the shelf for their use case. Making privacy-preserving, in-camera image processing easily accessible, with support for custom machine-learning models, gives end users an option they didn’t have before.
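To make the stick-figure idea concrete, here is a minimal illustrative sketch of such a pipeline. Everything in it is a stand-in, not Sony’s or Microsoft’s actual API: the pose detector is a stub, and the keypoint names and confidence threshold are assumptions. The point it demonstrates is structural: pose estimation runs on the device, the identifiable frame is discarded there, and only anonymous skeleton coordinates leave the camera.

```python
from dataclasses import dataclass


@dataclass
class Keypoint:
    """One joint of a detected 'stick figure' in normalized image coordinates."""
    name: str
    x: float
    y: float
    confidence: float


def detect_pose(frame):
    """Stand-in for an on-camera pose-estimation model.

    A real edge deployment would run a trained model here; this stub
    pretends it found one person mid-frame.
    """
    return [
        Keypoint("nose", 0.51, 0.22, 0.97),
        Keypoint("left_shoulder", 0.46, 0.35, 0.94),
        Keypoint("right_shoulder", 0.56, 0.35, 0.93),
    ]


def process_frame(frame):
    """Run pose estimation in-camera and forward only skeleton data."""
    keypoints = detect_pose(frame)
    del frame  # the identifiable image never leaves the device
    # Only anonymous joint positions are sent onward for counting/analytics.
    return [
        {"name": k.name, "x": k.x, "y": k.y}
        for k in keypoints
        if k.confidence > 0.5  # assumed threshold, tunable in practice
    ]


payload = process_frame(frame=object())  # any frame source would do
```

Downstream analytics (people counting, dwell time, fall detection) only ever see `payload`, a list of joint coordinates, which is exactly what the stick-figure overlays at CES were rendering.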
I’m writing this so that everyone knows they have that choice. Footage of a bunch of stick figures moving around a trade show floor or picking up groceries can still be used to count people, track suspicious behavior, monitor customer interest, and even detect security issues, all without creating a recognizable image that could haunt someone forever. Let’s all take that option and make it the norm.