The Beginning
Once upon a time, many years ago, websites were simple. They were created by hand in HTML, or with the assistance of software like Dreamweaver or Microsoft FrontPage. The web designer would get the website looking and behaving perfectly and then upload the site to a web hosting platform. Cookies were a snack. Trackers were tough, skilled individuals who could find someone lost in a forest, or locate an escapee from prison. My, how times have changed.
“Web Hits”
If you wanted to know how many “hits” or visitors came to your website, most web hosts included a statistics page that interpreted data from the server’s logs. These logs would catalogue every visit and would include things like the visitor’s IP address, Internet service provider and other information necessary to serve up the website. It was possible to see how many distinct users visited the website, the number of visits to each page and other simple stats.
Increased Coding Sophistication
Then, in addition to HTML, other coding types came into existence. First there was CSS, which helped streamline how design elements appeared on a website. Then came JavaScript, among others. It was JavaScript, however, that transformed things. Now it was possible to let the user’s web browser do some of the computing work instead of just the server. It was also possible to learn things about the browsing habits and interests of the person using the browser. This was accomplished through small snippets of tracking code, which stored identifiers in the browser as “cookies.”
Pandora’s box was found.
Urchin Analytics
Around this time, a company known as Urchin created a JavaScript-driven “statistics” software program that produced much deeper detail about website visitors and their online habits. To implement the program, a snippet of JavaScript was placed on each webpage which tracked users as they navigated throughout the website. It was very effective software, and revolutionary at the time. It was also expensive, so most website owners didn’t take advantage of it.
Then one day, a fledgling search engine named Google did something revolutionary. The company purchased Urchin outright for $30 million and immediately made the software widely available for free. According to a release at the time, Jonathan Rosenberg, Google’s vice president of product management, said: “We want to provide Web site owners and marketers with the information they need to optimize their users’ experience and generate a higher return-on-investment from their advertising spending. This technology will be a valuable addition to Google’s suite of advertising and publishing products.”
Pandora’s box was officially opened.
Google Analytics Goes Viral
Google Analytics quickly became the de facto standard for website statistics. It became a best practice to install the code on any new website or redesign. I personally have installed Google Analytics on hundreds of websites.
Unbeknownst to those web developers at the time, Google planned to use its fledgling “free” product to essentially spy on, track and follow every user across the entire Internet. With Google Analytics installed on just about every website, Google was able to aggregate users’ browsing habits across sites, resulting in vast amounts of data about individual user preferences, habits, purchases and more. Combined with Google’s AdWords advertising product, Google gained even more insight about user purchase intent, what types of campaigns would result in success and, more importantly, the economic value of each of those purchases. Google was then able to use that intelligence to set cost-per-click rates for a broad list of services and products.
The Birth of the Internet Giants
Cost-per-click rates soared for categories like plumbing, which Google knew had high conversion rates. Generally, if someone is searching for a plumber, there’s a problem that needs to be solved… now. As of this writing, Google’s cost-per-click for emergency plumbing-related terms is $40. Yes, per click. Google has determined that the rates plumbers charge would justify a cost of $100 per lead, and that conversion rates for this type of service are north of 50%. Voila, the price is 40 bucks a click. The execs at Google probably believe this is actually a discounted price and that the amount should be $50. Give them time, they’ll get there.
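The pricing logic here is simple expected-value arithmetic, using the illustrative numbers above:

```javascript
// Back-of-the-envelope math behind a $40 CPC, using the article's
// illustrative plumbing numbers.
const valuePerLead = 100;    // what a plumber would pay for one qualified lead
const clickToLeadRate = 0.5; // fraction of ad clicks that become leads

// Expected value of a single click to the advertiser:
const valuePerClick = valuePerLead * clickToLeadRate; // $50 ceiling

// Google prices just below that ceiling:
const costPerClick = 40;
const advertiserHeadroom = valuePerClick - costPerClick; // $10 left on the table
console.log(valuePerClick, advertiserHeadroom); // 50 10
```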
As Google was gathering troves of valuable data, combining it with its maps and local business listings, another tech company set its sights on mining massive amounts of data gold from across the web. Facebook began attempting to monetize its platform in 2004. Soon after, the company developed the “Facebook pixel,” a piece of code placed on a website to help track how Facebook users were interacting with it and how effective ads on the platform were. Just like Google Analytics, web developers began installing the Facebook pixel on all websites.
Just as was the case with Google Analytics, the Facebook pixel provided Facebook with large amounts of aggregated user data across the entire Internet. Most Facebook users stay logged in to the platform continually. As a result, Facebook can track, through its pixel, user preferences, purchases, interests, everything. And since each user has a profile on Facebook, the company is in an even better position than Google to truly “know” its users.
Of course, Google had a ten-year head start on Facebook’s surveillance capabilities. Google Analytics was introduced in 2005 and the Facebook pixel in 2015. As of this writing, Google has been accumulating massive amounts of data for 17 years. That’s a lot of data.
The Creep Factor
It didn’t take long, however, for the creep factor to set in for the average user. The ability to “re-target” users was one of the benefits of this surveillance at scale. It works like this: say a user visits a website that sells raincoats. The raincoat seller adds a piece of code to the site that lets Google know the user is there and follows them as they browse. Google then makes a note of the interest and follows the user around the Internet, reminding them of it. For instance, since Google controls the largest share of advertising across the Internet, it can track users at the raincoat shop and then show them raincoat ads on the Weather Channel.
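That flow can be sketched as a toy model. All identifiers, interest tags and ads here are invented; a real ad network does this at vastly greater scale:

```javascript
// Toy model of re-targeting: the shop's tracker records an interest
// against the visitor's ad-network identifier, and a later ad request on
// an unrelated site consults that profile.
const interestProfiles = new Map(); // visitorId -> Set of interest tags

// Called by the tracker on the raincoat shop.
function recordInterest(visitorId, tag) {
  if (!interestProfiles.has(visitorId)) {
    interestProfiles.set(visitorId, new Set());
  }
  interestProfiles.get(visitorId).add(tag);
}

// Called later by the ad server on any site in the network.
function pickAd(visitorId, availableAds) {
  const interests = interestProfiles.get(visitorId) || new Set();
  // Prefer an ad matching a recorded interest; otherwise fall back.
  return availableAds.find((ad) => interests.has(ad.tag)) || availableAds[0];
}

recordInterest("abc-123", "raincoats"); // the visit to the raincoat shop
const ad = pickAd("abc-123", [
  { tag: "generic", headline: "Check the forecast" },
  { tag: "raincoats", headline: "Still thinking about that raincoat?" },
]);
console.log(ad.headline); // "Still thinking about that raincoat?"
```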
The Empire Strikes Back
This mass surveillance, coupled with Google and Facebook’s meteoric revenue growth over time, piqued the interest of more than one regulator across the globe. Eventually, someone took action. The General Data Protection Regulation (GDPR) was adopted by the European Union in 2016 and became enforceable in 2018. The law empowers EU citizens to have control over how data about them is gathered and used. It was a watershed moment for website owners on both sides of the Atlantic. While the law provides privacy protections for EU citizens, it doesn’t apply to EU websites exclusively. It applies to all websites, regardless of the country where the website is managed or hosted.
So, essentially, if a website hosted and managed in the U.S. attracts visitors from the EU, it may need to comply with the GDPR. The EU has the authority to issue fines or other enforcement actions against websites that run afoul of the law, regardless of geographic location. This initially caused quite a stir among the larger U.S.-based content and news sites. Some even blocked access to their websites from EU visitors until they could become compliant with the new regulation.
The Balkanized Approach in the U.S.
Meanwhile, in the U.S., multiple attempts to craft a national privacy law failed for one reason or another. As a result, state legislatures started taking matters into their own hands. California was the first state to pass a privacy law, the CCPA, which provided significant data protections and rights for California residents. Like the GDPR, the law extended beyond the state, obligating website owners nationwide to comply if certain conditions were met. Soon, other states joined the fray, passing their own versions of privacy laws and requiring compliance beyond state lines.
In the end, the two largest data-privacy offenders, Google and Meta, are pretty much unaffected by all this legislation. They have crafted policies that basically say, “yeah, we’re going to take your data and do whatever we want with it.” Meanwhile, compliance is required for the other 99% of websites out there.
This isn’t necessarily a bad thing. Google and Meta will eventually be brought into the fold of data privacy respecters, but the bottom line is that companies everywhere will need to take a look at their policies, where all of their data resides, and how it is protected.
Everybody Wants Privacy
Poll after poll has shown that there is broad, bipartisan support for protecting privacy in the U.S. Far-right and far-left are all in agreement – everyone wants their privacy protected. Pretty much nobody wants their privacy sold to the highest bidder, or even worse, stolen by some foreign country.
Businesses that take the high road and get ahead of the coming onslaught of privacy regulations will reap the rewards of greater trust and potentially more business from customers. Here are some steps that can be taken to take the lead in privacy:
- Get a handle on where all data is stored. Find every app and database where data resides. Check to make sure the data is secure. Ensure that there are viable contracts in place if the data resides with a vendor. Remove duplicated or older, unused databases.
- Publish a compliant privacy policy. A privacy policy should tell users why you’re collecting data from them, what you do with the data and how you intend to protect it. Users should also have a mechanism to learn what data has been collected and how to delete it.
- Remove third-party cookies or add a consent management platform (CMP). Third-party cookies such as those set by Google Analytics or Meta’s Facebook pixel either need to be removed, or user consent needs to be obtained before they load in users’ browsers. Today, there are a number of privacy-first alternatives to Google Analytics. The Facebook pixel offers very little value to website owners and should be discarded unless there is a unique business case for keeping it.
- Train staff members who impact data collection on proper data handling. Marketing, HR and Sales are typically the tip of the spear for data collection in most organizations. Create policies and training programs to teach team members how to collect data responsibly.
- Inform users about your privacy protections. Don’t just bury your privacy efforts in some obscure privacy policy. Shout it from the rooftops! Let everyone know that your organization is privacy-first. You’ll build trust with users and lower barriers, resulting in more customers and deeper relationships with them.
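The consent-management step in the list above boils down to gating which tracking scripts load until the user opts in. A minimal sketch, with invented script URLs and consent categories:

```javascript
// Consent-gated tracker loading, the core job a CMP performs.
// Script URLs and category names are illustrative.
const TRACKERS = [
  { url: "https://stats.example.com/analytics.js", category: "analytics" },
  { url: "https://ads.example.com/pixel.js", category: "marketing" },
];

// Return only the scripts the user has consented to. In a real page,
// each returned URL would then be injected as a <script> element;
// anything not returned simply never loads.
function scriptsToLoad(consent) {
  return TRACKERS
    .filter((tracker) => consent[tracker.category] === true)
    .map((tracker) => tracker.url);
}

// User allowed analytics but refused marketing cookies:
console.log(scriptsToLoad({ analytics: true, marketing: false }));
// -> ["https://stats.example.com/analytics.js"]
```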
Reach out to us with any questions.