This grew out of a rant on a semi-private email list. That list is full of high-power geeks and I can assume they care about this stuff. Maybe my general readership here doesn't. You are not a bad person if you don't care and skip this.
Someone posited that Zuckerberg was "between a rock and a hard place" in his testimony on Capitol Hill. That's completely false. The only hard thing about that appearance was that the Congresscritters bounced across a wide array of topics, from the Cambridge Analytica data breach to privacy to security to fraud to past failures. Facebook's motto is "move fast and break things". The unspoken coda is "...and then meaninglessly apologize afterward while changing nothing."
Zuckerberg is not stupid. He realizes this Congress can't get its act together to pass a budget. The odds of them agreeing on "regulation" as he vaguely called for are zero. However, while Congress flails and gets distracted by... oh, I dunno, the likelihood that our Dear Leader will get us into an actual shooting war with Russia or North Korea or maybe actually find a stooge willing to fire the person who appears to have both the authority and integrity to uncover how a corrupt foreign-funded group of oligarchs hijacked our government, Zuck gets to shrug and say "Well, you saw I called for regulation; we're waiting on the government now."
Mr not-stupid Zuckerberg also controls the majority of Facebook's voting power, thanks to its dual-class stock structure. Even if he steps back from day-to-day running of the company (which I expect him to do; it'll look good and free him up to do more of what he actually wants to do) he'd still be the real power there and have things his way no matter who sat in the hot seat.
The only downside for him I see from this is that he's likely going to have to quash or at least long-term backburner his political ambitions. If you're reading this and going "what?" then stop and think. Imagine that you controlled a tool some third party kind of ineptly used to get a really stupid and venal version of Forrest Gump elected president of the US. You, however, have access to ALL of it. You know how it works, and can instruct the engineers to make it work better for you. You are not stupid and if you think you couldn't do a better job of this than DJT you're just not thinking big enough. Zuck has the ego, the money, and the means. Right now, though, his name is mud. Let's see how well this prediction has aged in 2028 when (if I've done my math right) he will be 44 years old. Obama was 47 when he won his first presidential election.
Now let's turn to the question of what Facebook actually could do, if it wanted to solve this problem. It's completely capable of doing so - don't let anyone tell you otherwise - but it would require a significant revamp of its business model. The short form is that FB (and Twitter and pick-your-favorite-social-platform) could operate as a fiduciary: an entity that holds something of value (property, power, information) in a relationship of trust for another.
Facebook would become a fiduciary for our personal information: by default that information would not be released to others, but would be released for specific purposes to known individuals and named groups. The only widespread model we have in America right now for information fiduciaries is HIPAA. That regime allows for-profit entities to hold sensitive data (medical records, mostly), use those data for their own benefit, and, under agreements, share those data with other authorized for-profit entities. Medical privacy isn't perfect, nor do I think HIPAA is without flaws. But right now it's the best model we have. Banking secrecy laws such as you find in places like Switzerland might be another fiduciary model, but I'm not as familiar with them.
Imagine, then, that Facebook were your personal-info fiduciary. There would be language saying that accepting a "friend" request included permission for FB to share some data with that person, etc. You'd sign forms (much as you do when you visit a doctor) that allowed Facebook to share data with partners. FB would profit because other companies would pay to become such authorized partners and get access to those juicy data. Probably not enough, though. I imagine FB would end up with some kind of "freemium" business model, where you'd pay something like $5-10/month to have Facebook act as your fiduciary. You pay fees to your bank and broker now - larger fees than that, much of the time. If you didn't want to pay, Facebook could behave as it does now, sharing your data freely with whoever.
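The core mechanic here - hold everything by default, release only what a specific party was granted for a specific purpose - is simple enough to sketch in a few lines. This is a toy illustration of the idea, not any real Facebook API; every class, method, and field name below is mine:

```python
from dataclasses import dataclass, field

@dataclass
class FiduciaryVault:
    """Toy model of a platform acting as an information fiduciary:
    data is held in trust and released only under explicit,
    purpose-scoped grants. The default is deny."""
    data: dict                                   # the user's personal information
    grants: dict = field(default_factory=dict)   # (party, purpose) -> set of allowed fields

    def grant(self, party, purpose, allowed_fields):
        """Record a consent form - e.g. signed when accepting a friend
        request or when authorizing a partner company."""
        self.grants[(party, purpose)] = set(allowed_fields)

    def release(self, party, purpose, requested):
        """Release only the intersection of what was requested and what
        this party was granted for this purpose; everything else stays private."""
        allowed = self.grants.get((party, purpose), set())
        return {k: self.data[k] for k in requested if k in allowed}

vault = FiduciaryVault(data={"name": "Alice", "email": "a@example.com", "likes": ["hiking"]})
vault.grant("friend:bob", "social", ["name", "likes"])

print(vault.release("friend:bob", "social", ["name", "email", "likes"]))
# {'name': 'Alice', 'likes': ['hiking']} - email withheld; no grant covers it
print(vault.release("advertiser:acme", "ads", ["email"]))
# {} - no grant at all, so default deny
```

The important design choice is that an *absent* grant means no release, which is the opposite of how ad-driven platforms behave today.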
Changing over to this kind of model wouldn't be easy, or cheap, but I think it would be much more profitable for the company in the long run. There's also a significant first-mover advantage to be had here. Setting the standards for this kind of thing - even if they were released freely for others to implement - would be an advantage. You'd have the first and best implementation.
This also doesn't require any sort of government regulation. HIPAA and many financial fiduciaries work with the backing of the government, which is good but not a requirement. Third-party services such as auditing firms and insurance companies exist to spread risk around and increase trust. Financial markets are largely self-regulated, with only some oversight from the Feds, and do just fine. Facebook could contract with existing companies to audit its (hypothetical future) fiduciary practices and to insure it against breaches. Of course those business expenses would get passed on to the users and partners; Facebook would have to lay out money up front but could expect to make it back once the system was fully up and running.
In summary, there's nothing stopping Facebook from solving this problem today - certainly the lack of government "regulation" is at best a red herring. Neither they, nor Zuckerberg personally, are in any sort of difficult situation. Solving this problem could be quite profitable, which leads me to think that either they're working on this kind of a solution or more likely they just don't care and will sit back and let the money continue to roll in while ignoring all the shit they've broken because there's no actual consequence to breaking such things.
no subject
Date: 2018-04-11 03:58 pm (UTC)
President Obama was 47, not 43, when he won his first presidential election.
Also, speaking as someone with a medical condition highly correlated with amputation, please don't use "lame" to mean "bad".
no subject
Date: 2018-04-11 04:24 pm (UTC)
Sorry about the other word; I'll go change that now.
no subject
Date: 2018-04-11 04:53 pm (UTC)
He was a senator from 2005 to 2008.
no subject
Date: 2018-04-12 09:11 pm (UTC)
I found a couple of clips from a FB investor call where Facebook's Chief Financial Officer, David Wehner, seems to think that GDPR requires (possibly pop-up?) permission screens for users when new features are rolled out. So people would get to opt out of every feature and every feature would get another checkbox in the privacy screens... oy.
no subject
Date: 2018-04-12 09:02 pm (UTC)
I assume you mean the new GDPR rules? I'm not super-familiar, but what I've read says that they need to offer (European) users a new opt-out option and, if they mishandle those users' data, they could be subject to fines.
It's definitely a better regime for users than what the US has. However, I think it's a fundamentally wrong-headed way to go. For one thing, we've already seen that people can't handle massive numbers of options; the vast majority of people leave things at the default settings. Unless GDPR were to require that the default be "out", most people aren't going to change it - and I doubt they'd call it "opt out" if the default were already out.
Furthermore, this just assumes that FB will treat people who opt out exactly the same and eat the loss. The company is pretty profitable, so they might do that. Or they might react the way sites do when they think you're using an ad blocker: instead you get chunks of ugly black on your screen with overlay text like "We're sorry, we can't show you the cool photo your friends Anna, Ben, Casey, and Dan have reacted to because that would violate your privacy settings. To adjust this preference, go here -->>"
Some people will ignore that, but people are on FB to be social and if the site makes it apparent that your settings are causing you to miss out on the social with your friends then I expect the majority of people will adjust their settings. So now we have a very small number of people who are European, who took the time to change the default setting, and who don't care enough about missing out on social with their friends. I'm guessing the impact of that sliver of users on FB's revenue is down in the noise somewhere and nobody notices.
A more interesting question is what happens when the next Cambridge Analytica mess happens. First, let's assume that EU regulators are diligent and can overcome FB's battery of high-powered lawyers. (That's not unreasonable; they seem to have succeeded against Google several times.) Now they slap FB with a big-ass fine. FB, in turn, files a lawsuit alleging that Future Cambridge Analytica (FCA) has to pay for this fine. They ask the judge to stay the fine while their suit against FCA winds its way to a conclusion. This is actually kind of reasonable, since I expect a court would buy that FCA is actually to blame for the data breach, since FCA violated the terms of service. At a minimum it's years of litigation, during which FB doesn't have to pay a dime. Or whatever 1/10th of a Euro is.
I'm sort of sad that GDPR doesn't mandate some kind of insurance because that's one of the ways I know to get people to change behavior and business practices. (That said, the history and practice of malpractice insurance is a strong argument against the effectiveness of this kind of liability insurance, but I digress.)