Busted by Your Fitbit: How Smart Devices Can Solve Crimes

A Fitbit fitness tracker. Police in Connecticut used a murdered woman's device to trace her last movements, evidence that led prosecutors to charge her husband with the killing.

A couple of decades ago, DNA tests were the frontier in solving crimes. But the array of devices we're putting in our homes and on our bodies is quickly becoming a detective's new best friend—at least while we still have detectives. Before long, artificial intelligence should be able to analyze the data pouring in from devices and nail criminals better than any human gumshoe. Time to develop a new TV show: CSI: Robots.

Two recent, well-publicized cases have given us a glimpse of this future. One involved Amazon's Echo device, which is driven by the company's artificial intelligence software, Alexa. An Echo can sit in a home and listen for verbal commands or questions. More than a year ago, James Bates had a friend over to his house in Arkansas and allegedly killed him after the two drank a boatload of vodka. Police found an Echo in the house and wanted Amazon to hand over any recordings or data from the night of the killing, thinking maybe Bates asked it something incriminating, like "Alexa, how do you get blood off an ax handle?" Amazon refused, but Bates's lawyer filed a motion in April saying his client would volunteer the data, which Amazon then sent to prosecutors.

Data from a Fitbit helped crack another case, which began in 2015 after police arrived at a couple's Connecticut home and found the wife shot dead. The husband said there'd been a violent struggle with an intruder, but the Fitbit worn by the wife showed she was walking around the house at the time her husband claimed they were fighting off an attack. Based on this new information, prosecutors charged the husband, Richard Dabate, with murder in April.

This trend is revving up, because smart devices are increasingly a part of our lives. We're wearing Apple Watches and Nike+ smart shoes that can track our movements, and Snap Spectacles that can take videos of what we see. In our homes, we're installing connected cameras, our smart TVs can tell if we're watching, and AI assistants like the Echo or Google Home listen to what we're saying. Think of all the clues these things could gather about a crime. Samsung even sells a Family Hub smart refrigerator that takes and analyzes photos of what's inside. It might have proactively called the cops on Jeffrey Dahmer.

So much more is coming. Amazon one-upped its Echo by introducing the Echo Look, which adds a camera so Amazon's AI can now gather visual information about you. The initial idea is that you can ask the Look to take photos of what you wear so it can get a sense of your style and recommend clothes to buy (from Amazon, as you might expect). The device also captures everything in the background, like maybe that loaded gun on the dresser behind you.

I met recently with a startup called Lighthouse, which is introducing a device similar to the Look but built with the kind of 3-D sensing technology that's helping self-driving cars navigate busy streets. Unlike the Look, Lighthouse is always on and watching, and the AI can learn the difference between members of a household and discern what's going on in the room. For instance, while in a bar with the Lighthouse executives, Chief Marketing Officer Jessica Gilmartin, who has a Lighthouse in her home, pulled out her phone and showed me a demo. She asked, by speaking into her phone, to see images of her kids running. Sure enough, the software showed video of each time her kids ran up the stairs. The company is pitching the product as a home monitor, but it's easy to see how detectives investigating a murder could ask Lighthouse to show them all images of, for instance, someone swinging a fireplace poker at someone else's head.

Another new company, Sunflower Labs, is starting to sell a drone-based "Home Awareness System." You get a little camera-armed drone that resides in a nest outside your house and a couple of sensors you plant in your lawn. If the sensors detect some stranger peeking in your windows, the system can dispatch the drone to go take a look. That alone would scare the crap out of most intruders, which will eventually make for an awesome compilation video on YouTube. But, again, this can provide even more data for crime solvers.

On a more industrial level, Axon, the company that makes Tasers, is making a huge push into crime-solving AI. Its approach seems brilliant. The company is offering free body cameras to any police department, because Axon knows the gold is in the data it will get back. The cameras can stream video back to Axon's servers, where that data can teach the AI software about police actions and crime scenes. Axon CEO Rick Smith described the strategy to the blog PoliceOne this way: "Imagine having one person in your agency who would watch every single one of your videos—and remember everything they saw—and then be able to process that and give you the insight into what crimes you could solve, what problems you could deal with. Based on what we're seeing in the artificial intelligence space, that could be within five to seven years."

If you put it all together, we're clearly heading toward a time when AI software can learn about criminal behavior and apply what it knows to data from the many devices that are going to monitor our lives. While it might take a human detective months to sort through a data set, AI can do it in a flash and find the tiniest clues in Look videos or drone alerts or refrigerator contents that most people would miss. The more our world gets digitized and turned into data, the better AI will be at solving crimes. "The crime scene of tomorrow is going to be the internet of things," concludes Scotland Yard's Mark Stokes in an interview with The Times of London.

Now there is this little catch called privacy, which is something the companies and the courts are going to have to work out. Like Amazon in that Arkansas case, most companies are reluctant to hand over user data. But investigators are going to ask for it, and courts are likely to grant warrants. We all have to know the trade-off we're making when we bring these gadgets into our homes or strap them on. Basically, we're voluntarily creating a surveillance state.

Just don't murder any early adopters. It won't turn out well.