Every time we use social media, sign up for a mailing list, or download a free app onto our phones, we agree to the provider’s terms of use.
While many of these agreements are unnecessarily dense and challenging to process, they do serve one very specific role for everyday people like you and me: they set the terms for how a company can use the personal data it collects from us. Typically, that data gets used in one of three ways:
In all three of these scenarios, there are specific parameters for how companies handle, store, and distribute your personal data. Yet in a work-from-home world, it has become more and more difficult for companies to enforce the proper handling of that data. Data misuse occurs when individuals or organizations use personal data beyond those stated intentions. Often, data misuse isn't the result of direct company action but rather the missteps of an individual or even a third-party partner. For example, a bank employee might access private accounts to view a friend's current balance, or a marketer might use one client's data to inform another customer's campaign.
To be clear, data misuse isn't necessarily theft. Theft occurs when a bad actor takes personal data without permission; data misuse occurs when legitimately collected information is applied in a way beyond its original purpose. These instances are typically less malicious than an insider threat selling company data to a third party, stemming more from negligence than from intent. In broad strokes, data misuse tends to fall into three categories:
Commingling happens when an organization captures data from a specific audience for a specific stated purpose, then reuses that same personal data for a separate task in the future. Reusing data submitted for academic research for marketing purposes, or sharing client data between sister organizations without consent, are among the most common commingling scenarios. Commingling often occurs out of ease of access: marketers and business owners already have the data and assume that, since they collected it, they are entitled to use it at their own discretion (a brief sketch of how a purpose check could prevent this kind of reuse appears after these three categories).
Data misuse for personal benefit occurs when someone with access to personal data abuses that power for their own gain. Whether driven by simple curiosity or the pursuit of a competitive advantage, this type of misuse is rarely carried out with malicious intent. It regularly involves company employees moving data to personal devices for easy access, often with disastrous results.
Ambiguity occurs when organizations fail to explicitly disclose, in a concise and accessible manner, how user data is collected and what that data will be used for. Organizations often take this approach because they are unsure how they want to use customer data but still want to collect it. However, ambiguity leaves the terms of use open to broad interpretation, giving the organization a blank check to use customers' personal data as they wish.
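One way to guard against both commingling and ambiguity is to record the stated purpose alongside the data at collection time and check it on every use. The sketch below is a minimal, hypothetical illustration in Python; the `Record` class, the purpose names, and `PurposeError` are invented for the example and do not describe any particular product or standard.

```python
from dataclasses import dataclass, field

class PurposeError(Exception):
    """Raised when data is requested for a purpose it was not collected for."""

@dataclass
class Record:
    subject: str                                      # whose data this is
    payload: dict                                     # the personal data itself
    collected_for: set = field(default_factory=set)   # purposes disclosed at collection time

def fetch_for_purpose(record: Record, requested_purpose: str) -> dict:
    """Release the payload only when the requested purpose was disclosed at collection.

    Reusing research data for marketing would be commingling, so a mismatch
    raises an error instead of silently handing the data over.
    """
    if requested_purpose not in record.collected_for:
        raise PurposeError(
            f"{record.subject}'s data was collected for {sorted(record.collected_for)}, "
            f"not '{requested_purpose}'"
        )
    return record.payload

survey = Record("alice@example.com", {"age": 34}, {"academic_research"})
print(fetch_for_purpose(survey, "academic_research"))   # allowed: matches the stated purpose
# fetch_for_purpose(survey, "marketing")                # would raise PurposeError (commingling)
```

The point is not the specific mechanics but the habit: if a purpose is never written down, it can never be checked, which is exactly the loophole that ambiguous terms of use create.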
While the intention may be innocent, the stakes can be incredibly high. The cost of data misuse can range from thousands to billions of dollars in fines, not including ransomware damages or settlements resulting from the misuse. We've seen data misuse cases take center stage multiple times in recent years, and in every instance there have been significant ramifications for the company and its customers.
In perhaps the most infamous example of data misuse, news outlets revealed in 2018 that the UK political consulting firm Cambridge Analytica had acquired and used personal data from Facebook users that was originally collected by a third party for academic research. In total, Cambridge Analytica misused the data of nearly 87 million Facebook users, many of whom had not given explicit permission for the company to use or even access their information. Within two months of the scandal, Cambridge Analytica was bankrupt and defunct, while Facebook was hit with a $5 billion fine from the Federal Trade Commission.
While the financial impact of data misuse shouldn't be understated, perhaps the greatest business impact comes from the loss of trust between a company and its audience. It is entirely reasonable to expect the companies that handle our data to do so securely and under the agreed terms. Anything short of that agreement is a massive violation of trust between the people and the service provider, and that trust is not easily rebuilt. Cambridge Analytica folded in less than three months, Google is still facing constant criticism, and Uber will be audited for the better part of the next two decades.
While these actions may lack the malice of a traditional black hat cyberattack, data misuse widens the opportunities for those same attackers to reach private data. Many instances of data misuse start with employees or legitimate third-party vendors transferring company data from a secure server onto a personal device with less stringent security controls. Even the most robust network security provisions are irrelevant once data leaves the secure perimeter. Once that personal data, or access to it, sits on a more susceptible device, cybercriminals have a much easier path to the information they are after.
Often, data misuse boils down to ignorance and negligence. However, as our digital footprints continue to grow and evolve, the need for responsible digital hygiene extends to every citizen of the internet, not just IT professionals. That starts with improving our general online practices so that we, as users, are more selective about the companies we trust with our data and, as professionals, treat our customers' data with the same care we would our own.
2. Practice conscious digital hygiene
For organizations and individuals alike, putting your faith in the wrong partner can have disastrous results. As we saw with Facebook and Marriott, poor practices by a third-party vendor can not only compromise entire organizational networks but also sully the trust between brands and their customers in an instant. Likewise, we ought to carefully weigh the trustworthiness of the places where we share our personal data.
While we can refine and perfect our online habits to prevent our own potential misuse, we rarely get to set data policies for the companies we frequent. We, as users, customers, and contributors, must hold the brands we trust accountable for maintaining those expectations. Change never happens out of complacency; whether it's Big Tech or Wall Street, the only way organizations create serious policy around data misuse is when their customers demand it. Organizations should have basic security structures, like behavior alerts and access management tools, complemented by need-to-know access and zero-trust architectures. Likewise, we as consumers have a right to clear data collection policies and transparent use cases.
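To make "need-to-know access" and "behavior alerts" a little more concrete, here is a minimal, hypothetical sketch in Python. The role names, data categories, and alert threshold are assumptions made up for illustration; real deployments rely on dedicated identity and monitoring tooling rather than a snippet like this.

```python
from collections import defaultdict
from datetime import date

# Hypothetical need-to-know map: each role may read only certain data categories.
ROLE_PERMISSIONS = {
    "teller": {"account_balance"},
    "marketing_analyst": {"campaign_metrics"},
    "support_agent": {"contact_info"},
}

ALERT_THRESHOLD = 20          # daily accesses per user before an alert fires (illustrative)
access_counts = defaultdict(int)

def request_access(user: str, role: str, category: str) -> bool:
    """Grant access only when the role's need-to-know covers the data category,
    and flag unusually high access volume as a possible sign of misuse."""
    allowed = category in ROLE_PERMISSIONS.get(role, set())
    key = (user, date.today())
    access_counts[key] += 1

    if not allowed:
        print(f"DENIED: {user} ({role}) tried to read {category}")
    if access_counts[key] > ALERT_THRESHOLD:
        print(f"ALERT: unusually high access volume for {user} today")
    return allowed

request_access("j.doe", "teller", "account_balance")    # permitted: within need-to-know
request_access("j.doe", "teller", "campaign_metrics")   # denied: outside the role's need-to-know
```

Even a default-deny rule this simple captures the spirit of zero trust: no one gets customer data just because they happen to be inside the network.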