Facebook ‘allowed’ Obama campaign to mine data


  • Facebook ‘allowed’ Obama campaign to mine data

    ‘They were on our side’: Facebook ‘allowed’ Obama campaign to mine data

    As Facebook faces public anger over Cambridge Analytica harvesting personal information for the Trump campaign, it’s been revealed that the social media giant allowed Barack Obama to do the same in 2012.
    Carol Davidsen, former director of media analytics for Obama’s 2012 campaign, has added fuel to the fire by revealing in a series of tweets that Facebook allowed the campaign to do “things they wouldn’t have allowed someone else to do.”

    That reportedly included “suck[ing] out the entire social graph” – an individual’s network of friends on Facebook – in a bid to target more and more potential voters through friends’ friends on social media.

    After Facebook “realized” what the Obama campaign staffers had been doing, they preferred to turn a blind eye for one simple reason: “they were on our side,” Davidsen claimed.

    Davidsen tweeted a link to a Time article, written in 2012, shedding some light on the Obama campaign's Facebook targeting operation, which according to her was codenamed “Project Taargus.”

    “…the more than 1 million Obama backers who signed up for the [Facebook-based app] gave the campaign permission to look at their Facebook friend lists. In an instant, the campaign had a way to see the hidden young voters. Roughly 85% of those without a listed phone number could be found in the uploaded friend lists. What’s more, Facebook offered an ideal way to reach them. ‘People don’t trust campaigns. They don’t even trust media organizations,’ says Goff [Teddy Goff, the Obama campaign’s former digital director]. ‘Who do they trust? Their friends.’”

    Davidsen also shared a link to a talk from 2015, in which she recalled how Facebook’s privacy policies in 2012 helped the Obama team win an army of supporters.

    “The privacy policies at that time on Facebook were – if they opted in, they could tell us who all their friends were. So, they told us who all their friends were. We were actually able to ingest the entire social network of the US that’s on Facebook, which is most people.”

    But that was back then. Now a more privacy-conscious public has sent Facebook’s shares tumbling, following reports in the New York Times and the Observer that data analytics firm Cambridge Analytica, which worked for Donald Trump’s 2016 election team, harvested private information from more than 50 million Facebook users.

    Facebook said it was suspending Cambridge Analytica after finding data privacy policies had been violated.

    NSA whistleblower Edward Snowden tweeted Saturday that Facebook is a “surveillance company” that sells its users’ personal details.

    “Businesses that make money by collecting and selling detailed records of private lives were once plainly described as ‘surveillance companies,’” the former National Security Agency contractor wrote. “Their rebranding as ‘social media’ is the most successful deception since the Department of War became the Department of Defense,” he added.

    https://www.rt.com/usa/421808-obama-facebook-mine-data/

  • #2
    'Utterly horrifying': ex-Facebook insider says covert data harvesting was routine

    Sandy Parakilas says numerous companies deployed these techniques – likely affecting hundreds of millions of users – and that Facebook looked the other way

    Hundreds of millions of Facebook users are likely to have had their private information harvested by companies that exploited the same terms as the firm that collected data and passed it on to Cambridge Analytica, according to a new whistleblower.

    Sandy Parakilas, the platform operations manager at Facebook responsible for policing data breaches by third-party software developers between 2011 and 2012, told the Guardian he warned senior executives at the company that its lax approach to data protection risked a major breach.

    “My concerns were that all of the data that left Facebook servers to developers could not be monitored by Facebook, so we had no idea what developers were doing with the data,” he said.

    Parakilas said Facebook had terms of service and settings that “people didn’t read or understand” and the company did not use its enforcement mechanisms, including audits of external developers, to ensure data was not being misused.

    Parakilas, whose job it was to investigate data breaches by developers similar to the one later suspected of Global Science Research, which harvested tens of millions of Facebook profiles and provided the data to Cambridge Analytica, said the slew of recent disclosures had left him disappointed with his superiors for not heeding his warnings.

    “It has been painful watching,” he said. “Because I know that they could have prevented it.”

    Asked what kind of control Facebook had over the data given to outside developers, he replied: “Zero. Absolutely none. Once the data left Facebook servers there was not any control, and there was no insight into what was going on.”

    Parakilas said he “always assumed there was something of a black market” for Facebook data that had been passed to external developers. However, he said that when he told other executives the company should proactively “audit developers directly and see what’s going on with the data” he was discouraged from the approach.

    He said one Facebook executive advised him against looking too deeply at how the data was being used, warning him: “Do you really want to see what you’ll find?” Parakilas said he interpreted the comment to mean that “Facebook was in a stronger legal position if it didn’t know about the abuse that was happening”.

    He added: “They felt that it was better not to know. I found that utterly shocking and horrifying.”

    Parakilas first went public with his concerns about privacy at Facebook four months ago, but his direct experience policing Facebook data given to third parties throws new light on revelations over how such data was obtained by Cambridge Analytica.

    Facebook did not respond to a request for comment on the information supplied by Parakilas, but directed the Guardian to a November 2017 blogpost in which the company defended its data sharing practices, which it said had “significantly improved” over the last five years.

    “While it’s fair to criticise how we enforced our developer policies more than five years ago, it’s untrue to suggest we didn’t or don’t care about privacy,” that statement said. “The facts tell a different story.”



    • #3
      ‘A majority of Facebook users’

      Parakilas, 38, who now works as a product manager for Uber, is particularly critical of Facebook’s previous policy of allowing developers to access the personal data of friends of people who used apps on the platform, without the knowledge or express consent of those friends.

      That feature, called Friends Permission, was a boon to outside software developers who, from 2007 onwards, were given permission by Facebook to build quizzes and games – like the widely popular FarmVille – that were hosted on the platform.

      The apps proliferated on Facebook in the years leading up to the company’s 2012 initial public offering, an era when most users were still accessing the platform via laptops and computers rather than smartphones.

      Facebook took a 30% cut of payments made through apps, but in return enabled their creators to have access to Facebook user data.

      Parakilas does not know how many companies sought Friends Permission data before such access was terminated around mid-2014. However, he said he believes tens or maybe even hundreds of thousands of developers may have done so.

      Parakilas estimates that “a majority of Facebook users” could have had their data harvested by app developers without their knowledge. The company now has stricter protocols around the degree of access third parties have to data.

      Parakilas said that when he worked at Facebook it failed to take full advantage of its enforcement mechanisms, such as a clause that enables the social media giant to audit external developers who misuse its data.

      Legal action against rogue developers or moves to ban them from Facebook were “extremely rare”, he said, adding: “In the time I was there, I didn’t see them conduct a single audit of a developer’s systems.”

      Facebook announced on Monday that it had hired a digital forensics firm to conduct an audit of Cambridge Analytica. The decision comes more than two years after Facebook was made aware of the reported data breach.

      During the time he was at Facebook, Parakilas said the company was keen to encourage more developers to build apps for its platform and “one of the main ways to get developers interested in building apps was through offering them access to this data”. Shortly after arriving at the company’s Silicon Valley headquarters he was told that any decision to ban an app required the personal approval of the chief executive, Mark Zuckerberg, although the policy was later relaxed to make it easier to deal with rogue developers.

      While the previous policy of giving developers access to Facebook users’ friends’ data was sanctioned in the small print in Facebook’s terms and conditions, and users could block such data sharing by changing their settings, Parakilas said he believed the policy was problematic.

      “It was well understood in the company that that presented a risk,” he said. “Facebook was giving data of people who had not authorised the app themselves, and was relying on terms of service and settings that people didn’t read or understand.”

      It was this feature that was exploited by Global Science Research, and the data provided to Cambridge Analytica in 2014. GSR was run by the Cambridge University psychologist Aleksandr Kogan, who built an app that was a personality test for Facebook users.

      The test automatically downloaded the data of friends of people who took the quiz, ostensibly for academic purposes. Cambridge Analytica has denied knowing the data was obtained improperly and Kogan maintains he did nothing illegal and had a “close working relationship” with Facebook.

      While Kogan’s app only attracted around 270,000 users (most of whom were paid to take the quiz), the company was then able to exploit the Friends Permission feature to quickly amass data pertaining to more than 50 million Facebook users.

      “Kogan’s app was one of the very last to have access to friend permissions,” Parakilas said, adding that many other similar apps had been harvesting similar quantities of data for years for commercial purposes. Academic research from 2010, based on an analysis of 1,800 Facebook apps, concluded that around 11% of third-party developers requested data belonging to friends of users.

      If those figures were extrapolated, tens of thousands of apps, if not more, were likely to have systematically culled “private and personally identifiable” data belonging to hundreds of millions of users, Parakilas said.

      The ease with which it was possible for anyone with relatively basic coding skills to create apps and start trawling for data was a particular concern, he added.

      Parakilas said he was unsure why Facebook stopped allowing developers to access friends data around mid-2014, roughly two years after he left the company. However, he said he believed one reason may have been that Facebook executives were becoming aware that some of the largest apps were acquiring enormous troves of valuable data.

      He recalled conversations with executives who were nervous about the commercial value of data being passed to other companies.

      “They were worried that the large app developers were building their own social graphs, meaning they could see all the connections between these people,” he said. “They were worried that they were going to build their own social networks.”



      • #4
        ‘They treated it like a PR exercise’

        Parakilas said he lobbied internally at Facebook for “a more rigorous approach” to enforcing data protection, but was offered little support. His warnings included a PowerPoint presentation he said he delivered to senior executives in mid-2012 “that included a map of the vulnerabilities for user data on Facebook’s platform”.

        “I included the protective measures that we had tried to put in place, where we were exposed, and the kinds of bad actors who might do malicious things with the data,” he said. “On the list of bad actors I included foreign state actors and data brokers.”

        Frustrated at the lack of action, Parakilas left Facebook in late 2012. “I didn’t feel that the company treated my concerns seriously. I didn’t speak out publicly for years out of self-interest, to be frank.”

        That changed, Parakilas said, when he heard the congressional testimony given by Facebook lawyers to Senate and House investigators in late 2017 about Russia’s attempt to sway the presidential election. “They treated it like a PR exercise,” he said. “They seemed to be entirely focused on limiting their liability and exposure rather than helping the country address a national security issue.”

        It was at that point that Parakilas decided to go public with his concerns, writing an opinion article in the New York Times that said Facebook could not be trusted to regulate itself. Since then, Parakilas has become an adviser to the Center for Humane Technology, which is run by Tristan Harris, a former Google employee turned whistleblower on the industry.

        https://www.theguardian.com/news/201...andy-parakilas
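
        To make the Friends Permission mechanism described in the article concrete, below is a minimal sketch of how a quiz-style app could have pulled friend data through the pre-2015 Graph API once a single user authorised it. The endpoint paths, field names and helper function are illustrative assumptions for this post, not Kogan's actual code and not Facebook's current API.

        import requests  # assumes the third-party 'requests' library is installed

        GRAPH = "https://graph.facebook.com"  # Graph API base URL; v1.0-era behaviour assumed

        def harvest_friend_profiles(user_token):
            # Illustrative only: under the old friends permissions, one consenting
            # quiz-taker's access token was enough to enumerate that person's friends.
            resp = requests.get(f"{GRAPH}/me/friends",
                                params={"access_token": user_token, "limit": 5000})
            friends = resp.json().get("data", [])

            profiles = []
            for friend in friends:
                # The friends themselves never authorised the app, yet their
                # profile fields could be requested with the same token.
                detail = requests.get(f"{GRAPH}/{friend['id']}",
                                      params={"access_token": user_token,
                                              "fields": "name,location,likes"})
                profiles.append(detail.json())
            return profiles

        Repeat that call chain for roughly 270,000 quiz-takers, each with a couple of hundred friends on average, and the 50-million-profile figure reported above is about what you would expect.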



        • #5
          Clearly some new era sh^t with the info involved, but same ole same ole political sh^t with companies helping certain politicians.

          The system is corrupt as f#ck my brothers & sissies.



          • #6
            It's quite interesting that everything the Democrats accuse Trump of doing is something they themselves did.



            • #7
              and of course the media never called out obama on it...

              TYPICAL.

              its crazy how much the legacy media minus fox treated obama with such kid gloves as he set fires at home and abroad.

              and this whole cambridge analytica issue is just another in a long line of fake news designed to make everything about the trump administration look bad and illegal.

              no law was broken by cambridge and as 1bad pointed out, done by dems.

              obama was lauded by the press as advanced, ahead of his time, strategically brilliant etc... for basically doing the same thing trump did....



              • #8
                Originally posted by Sterling Archer
                and of course the media never called out obama on it...
                LMFAO! Obama did it-real. Trump did it-fake news. Yet another thing Trump copied from Obama.



                • #9
                  Originally posted by The Big Dunn
                  LMFAO! Obama did it-real. Trump did it-fake news. Yet another thing Trump copied from Obama.
                  the fake news is about cambridge analytica being treated as some kind of scandal when obama and the dems started the data mining and targeting through social media, not that obama did it being real and trump doing it being fake.

                  stop being obtuse.



                  • #10
                    Originally posted by Sterling Archer
                    the fake news is about cambridge analytica being treated as some kind of scandal...
                    You'll love this. And as usual, the press said absolutely nothing back in 2013 when this came out:

                    "The President has put in place an organization with the kind of database that no one has ever seen before in life," Representative Maxine Waters told Roland Martin on Monday. "That's going to be very, very powerful," Waters said. "That database will have information about everything on every individual on ways that it's never been done before and whoever runs for President on the Democratic ticket has to deal with that. They're going to go down with that database and the concerns of those people because they can't get around it. And he's [President Obama] been very smart. It's very powerful what he's leaving in place."
