This grew out of a rant on a semi-private email list. That list is full of high-powered geeks and I can assume they care about this stuff. Maybe my general readership here doesn't. You are not a bad person if you don't care and skip this.
Someone posited that Zuckerberg was "between a rock and a hard place" in his testimony on Capitol Hill. That's completely false. The only hard thing about that appearance was that the Congresscritters bounced over a wide array of topics, from the Cambridge Analytica data breach to privacy to security to fraud to past failures. Facebook's motto is "move fast and break things". The unspoken coda is "...and then meaninglessly apologize afterward while changing nothing."
Zuckerberg is not stupid. He realizes this Congress can't get its act together to pass a budget. The odds of them agreeing on "regulation" as he vaguely called for are zero. However, while Congress flails and gets distracted by... oh, I dunno, the likelihood that our Dear Leader will get us into an actual shooting war with Russia or North Korea or maybe actually find a stooge willing to fire the person who appears to have both the authority and integrity to uncover how a corrupt foreign-funded group of oligarchs hijacked our government, Zuck gets to shrug and say "Well, you saw I called for regulation; we're waiting on the government now."
Mr. not-stupid Zuckerberg also controls the majority of Facebook's voting power, thanks to its dual-class share structure. Even if he steps back from day-to-day running of the company (which I expect him to do; it'll look good and free him up to do more of what he actually wants to do) he'd still be the real power there and have things his way no matter who sat in the hot seat.
The only downside I see for him in all this is that he's likely going to have to quash, or at least put on the long-term back burner, his political ambitions. If you're reading this and going "what?" then stop and think. Imagine that you controlled a tool some third party kind of ineptly used to get a really stupid and venal version of Forrest Gump elected president of the US. You, however, have access to ALL of it. You know how it works, and can instruct the engineers to make it work better for you. You are not stupid, and if you think you couldn't do a better job of this than DJT you're just not thinking big enough. Zuck has the ego, the money, and the means. Right now, though, his name is mud. Let's see how well this prediction has aged in 2028 when (if I've done my math right) he will be 44 years old. Obama was 47 when he won his first presidential election.
Now let's turn to the question of what Facebook actually could do, if it wanted to solve this problem. It's completely capable of doing so - don't let anyone tell you otherwise - but it would require a significant revamp of its business model. The short form is that FB (and Twitter and pick-your-favorite-social-platform) could operate as a fiduciary: that is, an entity that holds something of value (property, power, information) in a relationship of trust for another.
Facebook would become a fiduciary for our personal information, and by default that information would not be released to others; it would be released only for specific purposes, to known individuals and named groups. The only widespread model we have in America right now for information fiduciaries is HIPAA. That regime allows for-profit entities to hold sensitive data (medical records, mostly), use those data for their own benefit and, under agreements, share them with other authorized for-profit entities. Medical privacy isn't perfect, nor do I think HIPAA is without flaws, but right now it's the best model we have. Banking secrecy laws, such as you find in places like Switzerland, might be another fiduciary model, but I'm not as familiar with them.
Imagine, then, that Facebook were your personal-info fiduciary. There would be language saying that accepting a "friend" request included permission for FB to share some data with that person, and so on. You'd sign forms (much as you do when you visit a doctor) that allowed Facebook to share data with partners. FB would profit because other companies would pay to become such authorized partners and get access to those juicy data. Probably not enough, though. I imagine that FB would end up with some kind of "freemium" business model, where you'd pay something like $5-10/month to have Facebook act as your fiduciary. You pay fees to your bank and broker now - larger fees than that much of the time. If you didn't want to pay, Facebook could behave as it does now, sharing your data freely with whomever.
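To make the default-deny idea concrete, here's a minimal sketch in Python of how a fiduciary might track explicit grants and refuse everything else. Everything in it (the ConsentLedger class, the scope names, the partners) is made up for illustration; it's not a description of any real Facebook system.

```python
# Hypothetical sketch of the "default deny" fiduciary model described above.
# Names (ConsentLedger, grant, may_share) are illustrative, not a real API.

from dataclasses import dataclass, field

@dataclass
class ConsentLedger:
    """Tracks which recipients a user has explicitly authorized, and for what."""
    # Maps a recipient (a friend or a paying partner) to the data scopes they may see.
    grants: dict = field(default_factory=dict)

    def grant(self, recipient: str, scopes: set) -> None:
        """Record an explicit opt-in: accepting a friend request,
        or signing a partner-sharing form."""
        self.grants.setdefault(recipient, set()).update(scopes)

    def may_share(self, recipient: str, scope: str) -> bool:
        """Default deny: share only if this recipient was granted this exact scope."""
        return scope in self.grants.get(recipient, set())


ledger = ConsentLedger()
ledger.grant("alice", {"posts", "photos"})           # accepted friend request
ledger.grant("acme_analytics", {"aggregate_stats"})  # signed partner form

print(ledger.may_share("alice", "photos"))           # True
print(ledger.may_share("acme_analytics", "posts"))   # False: never granted
print(ledger.may_share("data_broker", "posts"))      # False: default deny
```

The point is just that the burden of proof flips: nothing moves unless the user has affirmatively opted a specific party into a specific use.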
Changing over to this kind of model wouldn't be easy or cheap, but I think it would be much more profitable for the company in the long run. There's also a significant first-mover advantage to be had here. Setting the standards for this kind of thing - even if they were released freely for others to implement - would be an advantage: you'd have the first and best implementation.
This also doesn't require any sort of government regulation. HIPAA and many financial fiduciaries work with the backing of the government, which is good but not a requirement. Third-party services such as auditing firms and insurance companies exist to spread risk around and increase trust. Financial markets are largely self-regulated, with only some oversight from the Feds, and do just fine. Facebook could contract with existing companies to audit its (hypothetical future) fiduciary practices and to insure it against breaches. Of course those business expenses would get passed on to the users and partners; Facebook would have to lay out money up front but would expect to make it back once the system was fully up and running.
In summary, there's nothing stopping Facebook from solving this problem today - certainly the lack of government "regulation" is at best a red herring. Neither the company, nor Zuckerberg personally, is in any sort of difficult situation. Solving this problem could be quite profitable, which leads me to think that either they're working on this kind of solution or, more likely, they just don't care and will sit back and let the money continue to roll in while ignoring all the shit they've broken, because there's no actual consequence to breaking such things.