It’s the end of the line for the heavily maligned Google+ service for consumers. The move to close the site comes after Google was forced to come clean about a three-year-old data breach that exposed private data for tens of thousands of users on the social network.
According to The Wall Street Journal, a bug in Google+ exposed profile data to outside developers from 2015 until it was discovered in March. Although Google quickly fixed the problem, it decided not to tell the public what had occurred, fearing government oversight and repercussions. Now that the breach has been publicized, the company has announced new privacy measures.
Why wasn’t it made public?
Through a spokesperson, Google contends it didn’t inform the public about the data breach because the incident didn’t reach certain “thresholds.”
These included “whether we could accurately identify the users to inform, whether there was any evidence of misuse, and whether there were any actions a developer or user could take in response.”
An internal Google memo from legal and policy staff showed that there was no evidence that outside developers actually misused the data. However, The Wall Street Journal says Google acknowledged “it has no way of knowing for sure.”
The breach in question was found as part of a broader internal audit called Project Strobe. During the process, Google says it discovered a flaw in an API the company created to help app developers access profile and contact information.
The report explains:
In March of this year, Google discovered that Google+ also permitted developers to retrieve the data of some users who never intended to share it publicly, according to the memo and two people briefed on the matter. Because of a bug in the API, developers could collect the profile data of their users’ friends even if that data was explicitly marked nonpublic in Google’s privacy settings, the people said.
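Purely as an illustration, the flaw described above amounts to an API handler that returns a friend's profile fields without checking their per-field visibility settings. The sketch below is a hedged assumption about what such a check might look like in general (the function names and data layout are invented for this example, not Google's actual code), with the corrected behavior in place:

```python
# Hypothetical sketch of the kind of per-field visibility bug described
# above. All names and data structures here are illustrative assumptions,
# not Google's actual API.

def get_friend_profiles(user, fields):
    """Return only the public profile fields of each of a user's friends."""
    results = []
    for friend in user["friends"]:
        visible = {}
        for field in fields:
            # Correct behavior: check the field's visibility setting before
            # exposing it. The reported bug effectively skipped this check,
            # so nonpublic fields leaked to developers.
            if friend["visibility"].get(field, "private") == "public":
                visible[field] = friend["profile"].get(field)
        results.append(visible)
    return results

alice = {
    "friends": [
        {
            "profile": {"name": "Bob", "email": "bob@example.com"},
            "visibility": {"name": "public", "email": "private"},
        }
    ]
}

print(get_friend_profiles(alice, ["name", "email"]))
# Only the public "name" field is returned; the private "email" stays hidden.
```

With the visibility check removed, the same call would hand back `email` as well, even though Bob never made it public; that is the shape of the leak the memo describes.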
As part of the measures announced on Monday, the company has decided to pull the plug on the consumer version of Google+.
In announcing the closure, the company says there were “significant challenges in creating and maintaining a successful Google+ product that meets consumers’ expectations.” Further, Google admits the service had “not achieved broad consumer or developer adoption, and has seen limited user interaction with apps.”
First launched in 2011 to take on Facebook, the social network never achieved the kind of success Google wanted. Google says that Google+ has low usage and engagement, with 90 percent of user sessions lasting less than five seconds.
Google has also decided to curb the use of application programming interfaces, or APIs, for many of its most popular services. These APIs, which had been made available to outside developers, require a user’s permission before any personal information can be accessed. However, bad actors can still abuse that access.
The company is also no longer giving third-party developers access to SMS messages, call logs, and other forms of contact data on Android phones. Additionally, only a small number of developers will be able to build Gmail add-ons going forward.
Google in trouble?
This probably won’t be the last time we hear about Google and whether it does enough to protect user data. Government oversight is increasingly likely, depending on the results of next month’s mid-term elections in the U.S. Class action lawsuits could also be filed, given that Google chose to hide this information for over six months.
I’m not one to promote government oversight. However, stories like this and the Facebook Cambridge Analytica fiasco have convinced me that something more needs to be done so that user data is better protected.
What do you think should happen to Google? Are its new security measures enough? Let us know your thoughts below.