In 1986, British police asked a molecular biologist to help them out. They were investigating two brutal rape-murders in the English Midlands, and a 17-year-old boy with mental disabilities had confessed. But the confession didn’t quite match the facts. Fearing they were hounding the wrong person, the police asked the biologist to use brand-new technology to solve the crime: to examine the molecules of the boy’s DNA and determine whether they matched samples left on the victims.

When the research was completed, the boy became the first person ever exonerated through DNA testing. Later, a man whose DNA did match the samples was found and convicted.

In the decade that followed, the use of DNA evidence in court became a hotly-contested subject, with legal and biomedical scholars battling out the norms of this radical new form of evidence in each jurisdiction around the world.

Using DNA evidence in criminal court is mostly a no-brainer now, but since then, other forms of evidence have had their own grand entrances (and chaotic upsets) in the judicial system. For example: lawyers and their staff have had to learn rapidly how to process slews of emails and Facebook posts.

Now the next litigating revolution may be bearing down on us: the massive amounts of data available through the Internet of Things.

The Internet of Things (or IoT) is the mass of networked objects colonizing every corner of our lives. Any device, gadget, or accessory that includes microprocessors, sensors, and network connectivity is a member of the IoT. We’re used to this with mobile phones, but now we need to think about thermostats, refrigerators, medical equipment, and toys (not to evoke any creepy doll-themed horror movies, but there’s even a Barbie connected to the Internet of Things, tricked out with sophisticated surveillance and learning software).

And this big pile of things is headed to the courts.

How much of a game-changer is this? As attorney Edward McNicholas puts it:

“As people start to have these devices talk to each other, the amount of data and the implications of that will be astounding [. . .]. I think people don’t appreciate that we’re in 1995. We’re right before the web, and regular people don’t appreciate that things are going to change dramatically.”

Echo and Narcissus Go to Court

In Greek mythology, Echo is a beautiful, talkative nymph who runs afoul of the gods and gets cursed to only repeat what others say. She falls in love with the beautiful Narcissus, who spurns her because he can only love himself. The tale ends with Narcissus staring into his reflection in a pool, Echo watching and repeating helplessly, until he drowns in his attempt to embrace his reflection.

When I heard that Amazon had decided to name their voice-controlled smart-speaker ‘Echo’ I wondered if this was some subtle criticism of our culture: while we’re narcissistically obsessed with ourselves, this lovely little device repeats us back to ourselves.

But in a macabre revisiting of the poolside story, Amazon’s Echo has recently been in the news because of a drowning:

In November of 2015, a former police officer was found dead in an Arkansas hot tub. The homeowner claims the man drowned accidentally and alone, but police suspect foul play. To look for proof, they’ve brought in an unusual witness: the homeowner’s Echo smart-speaker. Since the Echo, like its mythological namesake, is always listening, there’s some chance that the device or the Cloud holds a record of the night’s events.

At the same time, a Canadian personal injury firm is using a woman’s Fitbit (the watch-like gadget that records your physical activity and health details) to prove their client suffered a reduced ability to be active following an accident. (In the U.S., meanwhile, another woman’s Fitbit has been used as evidence against her allegations of rape.)

Though we’re only seeing a few of these cases now, attorney Michelle Lange writes: “Lawyers can only imagine the impact this will have on legal claims and defenses, with data security and privacy issues coming to mind.”

She continues:

Antagonists will protest that data from everyone’s Internet toasters and coffee makers will have minimal relevance in litigation. Could the same insular thinking have argued that social media data in Facebook, Twitter, or LinkedIn would not be impactful in a lawsuit? We need to think more broadly. The “Internet of Things” will only lead to the ediscovery of every thing. It will be a brave, new world of digital law and practice.

GPS information in cars and phones is already commonly used in incident reconstruction. So what about wearable devices that record your heart rate and physiological response at the time of an incident? Or that show how much sleep someone got the night before, to either support or counter claims of sleep deprivation? What if an insurance company won’t settle a claim because they discover from your client’s wearable device that she walks a lot, though you claim that walking causes her pain? Would you like to shore up a witness’s testimony with a time-stamped Google Glass recording of an incident?

Creative and tech-savvy attorneys should be asking questions like these if they hope to stay afloat through this next revolution.

Tomorrow is Here

Law firms are already playing catch-up with the onslaught of new technology in the Internet of Things. And they don’t have a lot of time. As far back as 2008, there were already more things connected to the internet than there were people on the planet. Some sources predict that 50 billion objects will be uploading data to the cloud by 2020.

Legal tech expert Antigone Peyton writes: “From a litigator’s perspective, there are benefits and risks associated with IoT evidence. These connected objects, combined with big data analytics, can make cases simultaneously clearer and more complicated.” You need to understand this technology — or have a good working relationship with someone who does — in order to navigate these benefits and risks.

Here are seven areas to consider as we move into this brave new world:

1. Introduce IoT to Clients

Begin talking about the Internet of Things early on with your clients. Attorneys have learned that they don’t want their clients posting certain things on Facebook—it may be worth discussing their gadgets’ capacity to spread data detrimental to the case as well.

You can also begin thinking creatively about whether your client may have gadgets that could help prove their case, as in the Canadian case with the Fitbit, and ensure they can get access to that information. Also ask whether anything owned by someone close to them, or in their workplace, might have gathered information about the incident or their subsequent injuries.

As you interface with your client, the more you personally understand about how data is collected and stored, the better you’ll be able to represent their needs. This is one area where a tech-savvy lawyer can gain a clear advantage over both their competitors and the opposing counsel—or can at least avoid being left in the dust with our flip phones and fax machines.

2. Discover Opposing Parties’ Data

What objects in the Internet of Things might the opposing party have that could give you relevant information about the case? Consider this question in your preservation letter to opposing parties, clarifying the need for them to preserve information they could otherwise thoughtlessly delete in a software upgrade.

In 1970, Federal Rule of Civil Procedure 34, which governs the discoverability of documents and things, was expanded to explicitly include electronic data. In 2006, the rules committee noted: “Since then, the growth in electronically stored information and in the variety of systems for creating and storing such information has been dramatic.”

The 2006 comments further specify: “Electronically stored information may exist in dynamic databases and other forms far different from fixed expression on paper. Rule 34(a) is amended to confirm that discovery of electronically stored information stands on equal footing with discovery of paper documents.”

The comments further clarify that issues of burden, intrusiveness, confidentiality, and privacy apply to electronic data much like they would with any other object, and that the rule “is not meant to create a routine right of direct access to a party’s electronic information system, although such access might be justified in some circumstances.”

3. Preserve Clients’ Possibly-relevant Information

Though it’s widely considered illegal for an employer or insurer to force someone to wear a monitoring device like a Fitbit, if your client is already using one, you may find you need to turn over their information to an opposing party, if it’s requested and the information could be seen as relevant.

In case such a request is made, this data should be protected and preserved as soon as litigation is anticipated. Federal Rule of Civil Procedure 37(e), adopted in 2006, states that “[i]f electronically stored information that should have been preserved in the anticipation or conduct of litigation is lost because a party failed to take reasonable steps to preserve it, and it cannot be restored or replaced through additional discovery,” the court can demand a variety of sanctions or curative actions.

In a 2015 comment to this rule, the committee notes: “This limited rule has not adequately addressed the serious problems resulting from the continued exponential growth in the volume of such information.” They write that federal courts have been fairly harsh on parties who don’t preserve their electronic information, pushing those involved in litigation into increasingly scrupulous data-maintenance.

But they recognize the scrupulosity can’t continue forever, writing: “Due to the ever-increasing volume of electronically stored information and the multitude of devices that generate such information, perfection in preserving all relevant electronically stored information is often impossible.” The comments recommend that courts consider “the routine, good-faith operation of an electronic information system” before imposing sanctions for each lost kilobyte.

The comments also recommend that the court “be sensitive to the party’s sophistication with regard to litigation in evaluating preservation efforts; some litigants, particularly individual litigants, may be less familiar with preservation obligations than others who have considerable experience in litigation.”

4. Understand Limits

Remember those aerial images from Iraq that supposedly ‘proved’ there were weapons of mass destruction there? Let this be a reminder to us: more information is not always useful if we’re not skilled at interpreting it correctly.

While there are many opportunities with IoT discovery, they come with their share of headaches.

Much e-discovery software, built to comb through emails, isn’t set up to understand computer-speak. It understands human words, but not the machine-to-machine chatter that Cloud-connected objects exchange. This means that crafting a narrative out of IoT bits and bytes can be far more time-intensive and costly.
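To make the gap concrete, here is a minimal sketch of what that translation work looks like. Everything in it is invented for illustration: the log format, the device name, and the event codes are hypothetical, not any real vendor’s output. The point is simply that raw IoT records are terse machine data, and someone has to write the layer that turns them into a timeline a judge or jury can read.

```python
import json
from datetime import datetime, timezone

# Hypothetical raw event log, as a smart thermostat might report it:
# terse machine-to-machine records rather than readable sentences.
RAW_LOG = """\
{"ts": 1448841600, "dev": "thermostat-01", "evt": "occupancy", "val": 1}
{"ts": 1448845200, "dev": "thermostat-01", "evt": "setpoint", "val": 18}
{"ts": 1448852400, "dev": "thermostat-01", "evt": "occupancy", "val": 0}
"""

# Plain-English descriptions for each (made-up) machine event code.
DESCRIPTIONS = {
    "occupancy": lambda v: "motion detected" if v else "room vacated",
    "setpoint": lambda v: f"temperature set to {v} C",
}

def to_timeline(raw: str) -> list[str]:
    """Translate terse JSON event records into a timestamped narrative."""
    lines = []
    for record in raw.splitlines():
        event = json.loads(record)
        when = datetime.fromtimestamp(event["ts"], tz=timezone.utc)
        desc = DESCRIPTIONS[event["evt"]](event["val"])
        lines.append(f"{when:%Y-%m-%d %H:%M} UTC | {event['dev']}: {desc}")
    return lines

for line in to_timeline(RAW_LOG):
    print(line)
```

Even this toy version requires knowing what each event code means on each device model; in practice that knowledge often sits with the manufacturer, which is part of what makes IoT discovery expensive.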

Though developers are trying to make information more accessible and shareable, our ability to analyze information will always lag behind the tech industry’s capacity to create new forms of it. Though we should learn how to make sense of data—and form strong working relationships with experts on the subject—we also need to understand that some things will remain illegible.

Writing for The Atlantic, Kate Crawford explains how this is playing out in the Canadian personal injury case that is using Fitbit data as evidence. The plaintiff’s attorneys aren’t simply offering up raw data: rather, they’re going through an analytics company that will compare their client’s information to a general ‘standard’ — “In other words,” writes Crawford, “they specialize in taking a single person’s data, and comparing it to the vast banks of data collected by Fitbits, to see if that person is above or below average.”
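In rough outline, and only as a hypothetical sketch (the analytics companies’ actual methods are proprietary and undisclosed), that kind of comparison amounts to measuring how far one person’s numbers sit from a population baseline. Every figure below is invented:

```python
import statistics

# Invented daily step counts: the plaintiff's post-accident data
# versus a (made-up) sample of other wearable users.
plaintiff_steps = [2100, 1800, 2400, 1950, 2200]
population_steps = [7500, 6200, 9100, 8000, 5400, 7700, 6900]

def z_score(value: float, sample: list[float]) -> float:
    """How many standard deviations `value` sits from the sample mean."""
    mean = statistics.mean(sample)
    stdev = statistics.stdev(sample)
    return (value - mean) / stdev

score = z_score(statistics.mean(plaintiff_steps), population_steps)
# A strongly negative score suggests activity well below the sample average.
print(f"z-score vs. population: {score:.2f}")
```

Notice how much the conclusion depends on choices hidden inside the baseline: who is in the comparison sample, how their data was collected, and what counts as “normal.” Those are exactly the choices the analytics firms don’t disclose.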

However, this concept of a ‘norm’ could be contested in the courts. Research on what ‘healthy’ means is complex and rapidly developing. But analytics companies typically don’t reveal the research they’re guided by, the data sets they’re using, or the tools they use to analyze them.

And the data itself has its peculiarities. Crawford notes that some wearable devices

count moving your arms around as walking (which is great if you want writing to count as exercise), others can’t easily register cycling as activity. The sleep-tracking functions deploy relatively crude methods to determine the division between light and deep sleep. This “chaos of the wearable” might be merely amusing or frustrating when you’re using the data to reflect on our own lives. But it can be perilous when that data is used to represent objective truth for insurers or courtrooms. And now that data is being further abstracted by analytics companies that create proprietary algorithms to analyze it and map it against their particular standard of the “normal” healthy person.

While expert witnesses on these issues will help explain these complexities to judges and juries, all of this also recommends a certain amount of humility—not just with wearable health devices, but with the whole range of internet-connected things. Tech and legal scholars agree that we need to give up our fantasy of the ‘perfectly objective witness.’ Crawford concludes:

Ultimately, the Fitbit case may be just one step in a much bigger shift toward a data-driven regime of “truth.” Prioritizing data—irregular, unreliable data—over human reporting, means putting power in the hands of an algorithm. These systems are imperfect—just as human judgments can be—and it will be increasingly important for people to be able to see behind the curtain rather than accept device data as irrefutable courtroom evidence. In the meantime, users should think of wearables as partial witnesses, ones that carry their own affordances and biases.

As we increasingly use devices as evidence, we should think of them as fallible, partial witnesses, with their own situated ‘worldview,’ much as human witnesses have. It’s possible judges in various jurisdictions won’t allow data from IoT devices as evidence, deeming it insufficiently objective and lacking scientific rigor; but there’s also a danger that judges and juries will put too much faith in the devices and the analytics used to understand them, considering this data the whole truth of the situation. Sophisticated attorneys should weigh these complexities in their arguments.

5. Decipher Data Ownership

The big question with IoT devices and e-discovery is data ownership. These gadgets usually transfer data to the Cloud rather than storing it on the device. With such shifting lines of ownership, the location and control of data is increasingly complex and vague: does it belong to the consumer or to the maker of the device?

For information your client wishes to preserve or to keep confidential, you may want to contractually clarify ownership of information with the Cloud provider.

6. Remember Proportionality

When discovering opposing parties’ IoT data or offering up your client’s data, don’t lose sight of costs in time and money. Ian Lopez writes in LegalTech News about a hypothetical car accident where someone’s “driving a smart car on a smart road in a smart city using a smart traffic system,” and they hit someone wearing a GPS and health monitoring device. While each smart, networked object in this scenario could offer up reams of information, Lopez asks:

is it really necessary to get all of that data? I mean, it’s clearly relevant. But considering the cost of the case and the amount of time needed to process the data, do you really need to generate a 360-degree simulation of the accident based on sensory data when a witness would be less costly, or tire marks indicate an abrupt stop, etc.?

If standard methods can make an adequate argument, it may not be worth diving into the high tech.

And if the information is difficult and costly to get, you may not be able to demand it from the opposing party. The Federal Rules of Civil Procedure address this issue directly. Rule 26(b)(2)(B) states:

Specific Limitations on Electronically Stored Information. A party need not provide discovery of electronically stored information from sources that the party identifies as not reasonably accessible because of undue burden or cost.

And in Rule 37’s notes on 2015 amendments, the committee notes:

Another factor in evaluating the reasonableness of preservation efforts is proportionality. The court should be sensitive to party resources; aggressive preservation efforts can be extremely costly, and parties (including governmental parties) may have limited staff and resources to devote to those efforts. A party may act reasonably by choosing a less costly form of information preservation, if it is substantially as effective as more costly forms.

But reiterating my recurring theme that it’s important to understand these devices and the data they hoover up, the committee then writes:

It is important that counsel become familiar with their clients’ information systems and digital data — including social media — to address these issues. A party urging that preservation requests are disproportionate may need to provide specifics about these matters in order to enable meaningful discussion of the appropriate preservation regime.

Of course, it’s worth pointing out that in many ways IoT data can be less costly than other forms of e-discovery: in most cases it is less likely to contain privileged information, and it doesn’t require the same scrupulous, individual review that emails do.

But in any event, each situation will have its own issues of accessibility and proportionality. Judges should also take into consideration that information which is retrieved only with difficulty by the client (involving jailbreaking their device and hiring experts to figure out what it says) could be provided far more easily, and in a much clearer form, by the device manufacturer. And parties that want to get their paws on difficult data may find they need to present a very compelling argument for its necessity.

7. Safeguard Privacy

This is a general theme for attorneys in the tech world: each new device in your life presents a potential open door to private information—your own and your clients’. When you’re protecting client privacy from opposing parties in cases, you may be frustrated by scant recourse. Evan Schuman, writing for Computerworld, is concerned about the Echo case I mentioned at the beginning of this piece. He notes that Amazon tried to reassure customers by saying in a statement that it “will not release customer information without a valid and binding legal demand” and that in general the company objects to “overbroad or otherwise inappropriate demands”—but these are empty assertions as long as subpoenas and warrants exist.

Kate Crawford also worries over this issue when she addresses the IoT question. She notes that the Fifth Amendment protects the right against self-incrimination and the Sixth Amendment protects the right, in criminal cases, to be confronted with the witness against you. “Yet with wearables, who is the witness? The device? Your body? The service provider? Or the analytics algorithm operated by a third party? It’s unclear how courts will handle the possibility of quantified self-incrimination.”

But perhaps more frightening are privacy concerns outside of the discovery process, when it comes to corporations gathering huge amounts of our personal information. Schuman also notes: “There are no meaningful federal privacy laws in the U.S., outside of those that keep medical data, sealed court documents and some government records such as IRS tax returns away from prying eyes. Unless that changes, the IoT will make privacy a quaint recollection of our youth.”

And then there’s the issue of information theft. Attorneys have been the target of hacks before, and will continue to sit in the crosshairs of data-thieves. If you’re in a cynical state of mind, the wealth of networked gadgets around us suddenly appears like a prettified ankle monitor, allowing extortionists and thieves to track your every move. (Now is not the time to rewatch that Black Mirror episode).

In addition to promoting the most secure devices and platforms to protect themselves, tech-savvy plaintiff-side attorneys may find themselves positioned to close up loose and leaky systems in consumer protection cases. One current example is Cahen v. Toyota Motor Corp., a class action suit against the car company for selling vehicles vulnerable to hacking. The case is notable because it doesn’t allege that hacking has actually happened; rather, it points to weaknesses in the system that exposed unsuspecting consumers to potentially devastating hacks. It is currently on appeal to the Ninth Circuit Court of Appeals.

As society and regulators struggle to get a hold on our ever-eroding privacy, plaintiff-side attorneys will continue to play a role in defining an individual’s right to avoid perpetual surveillance.


The proliferation of the Internet of Things has been called the third revolution (after the Industrial Revolution and the creation of the internet itself). But if attorneys begin to learn the language of these gadgets and act as interpreters for their clients, there’s reason to believe we’re better equipped to swim in these new waters than we were to deal with revolutions past.

And it’s worth remembering that amid this binary buzz and technojargon, plaintiff-side attorneys retain the same old job: knowing how to tell a compelling story that represents our clients’ experiences and their right to legal remedy. The new gadgets are simply new tools for fleshing out our narratives—we need to know their capacities and their limits if we’re going to weave them into our practice, and guard against them being used against us.