Programming & IT Tricks


Saturday, January 20, 2018

The Lies Facebook Tells Itself (and Us)


Mark Zuckerberg on a tractor in Blanchardville, Wis., in April
Mark Zuckerberg informed us a few days ago that he would be rewiring our information landscape. Posts from friends and family move up in the rankings; news and media fall off. He made a few oblique references as to why and assured us in an insipid 533-word blog post that the changes would mean that the 50 minutes users spend on the platform each day would be “time well spent.”
Anyone who has been even partially sentient over the past few years has noticed how we have become shrouded in our filter bubbles, secure like never before in the complacency of our convictions. This certainty in the righteousness of our own point of view makes us regard a neighbor with a yard sign the way a Capulet regards a Montague. It seems to me that we suddenly hate each other a whole lot more than we ever did before.
So it should come as no surprise that the place where filter bubbles are the thickest, where the self-satisfied certitude that comes from unchecked power is iron-clad, is at the headquarters of Facebook itself. This was brought home to me when I read an interview with the head of Facebook’s News Feed product, Adam Mosseri, by the savvy tech blogger Ben Thompson.
Mosseri, who has been at Facebook for nearly a decade (eons in Facebook chronos), was eager to explain to an interviewer why this change was rational, normal, good for humanity (the company counts one quarter of humanity as monthly active users). The interview was quite a get for Thompson, and he published it in near-verbatim format. In so doing, he laid bare just how removed from the rest of humanity Facebook management is, and how blissfully ignorant they are about the consequences of their actions.
I refined my outrage into five points Mosseri makes (down from 15 initially) that illustrate the degree to which Facebook executives live in a world of their own making where the rest of us are expected to comply.

#1 The changes are for our collective “well-being”

The most glaring assumption that jumps out of this interview (as well as official Facebook communiques) is that we are all asked to swallow Facebook’s incredibly vague gauge of “well-being,” or “meaningful social interaction.” In fact, these terms are sometimes tossed about interchangeably. (Zuckerberg uses “well-being” three times in his post.)
Excerpt from interview on Stratechery.com.
In the excerpt above, Mosseri implies that Facebook is doing this for our own mental health, and that it’s based on extensive research. Interactions = good. Passively consuming content = bad.
Aside from the disturbingly paternalistic assumptions therein, can I ask how Facebook defines well-being? And, since they have done such extensive research, can they share it with the public transparently? Mosseri’s answer: “We’ll certainly consider it…” (Facebook has a blog post that discusses a few of its conclusions here.)
To me, this strikes at the heart of the peril posed by Facebook: The platform has probably more power than any company has ever wielded over information (and perhaps even our well-being). And yet it engages in zero public debate about the changes it makes. It simply rolls them out. We are asked to buy Facebook’s version of meaningful, as in this Mosseri statement: “So if you and I had a back and forth conversation on a post from a Page, that would actually count as a meaningful social interaction.” Hence, it would get a higher rank in the algorithm, etc.
Is an exchange “meaningful”? I can think of plenty of Facebook exchanges that merely raised my blood pressure. These are sweeping categories. Facebook has placed itself as the imperious custodian of our well-being, but tells us nothing about how it cares for us. And do they care if it has side effects? Just ask independent journalists in Bolivia what happens when Facebook starts using them as guinea pigs in an experiment about their well-being: Their audience drops, the government’s ability to control public opinion increases. And when they complain to Facebook, they get an automated reply email.

#2 “This change actually has very little to do with false news…”

Mosseri actually said that. But that’s not as stunning as what came next: “I will say that the amount of attention on false news specifically and a number of other integrity issues, certainly caught us off guard in a number of ways and it’s certainly been something we’ve tried to respond responsibly [to].”
Let’s unpack this. For more than a year, Facebook has been under scrutiny because there has been a flood of outright fake and misleading “news” coursing through its pipes. As studies have shown, people share fake news on Facebook, often more than the real stuff. The Pope endorsed Donald Trump? That spreads on Facebook. People get pissed. When the senior leadership at Facebook says this caught them “off guard,” I have to pick my jaw up off the floor. Inside the Facebook HQ, the filter bubble is thicker than a security blanket. They really believe that all they are doing is connecting people and fostering “meaningful interactions.” They are not playing Russian roulette with our democratic institutions or selling ads to people who want to burn Jews.
And this filter bubble is so impenetrable that they believe one minute that they have the power to manipulate our mood (they do) and are shocked the next when they get blowback for allowing people to manipulate our politics.
Then the last part: it’s “something we’ve tried to respond responsibly [to].” No, Facebook, you have not. The only responsible response after these revelations would be a massive overhaul of your system and a transparent conversation with the public and Congress about how your algorithm works. You have produced the information equivalent of a massive E. coli contamination. Instead, your response has been an under-funded effort to infuse fact-checking into the News Feed, and a 41% uptick in what you pay your lobbyists.

#3 “Does the scrutiny accelerate the process? It’s really hard to say.”

Yes, it does and no, it’s not. This statement is in response to Thompson’s question about the criticism Facebook has received in the past year over its distribution of fake and misleading news and whether that has prompted the company to assume greater responsibility over what its users see. Mosseri’s full response is here:
Excerpt from interview on Stratechery.com.
Here’s another counterfactual: Do you think the revelations about years of sexual abuse, assault and downright rape in the workplace by powerful men (Harvey Weinstein, Matt Lauer, Charlie Rose, etc., etc.) have accelerated the conversation about women’s rights and equity in the workplace? I mean, it’s possible.
So let’s assume that Facebook continues to post $4.7 billion in net income each quarter and its stock rises another 40% over the next 12 months (market cap at this writing is $517 billion), and there is no public criticism about fake news, targeting voters, and so forth. Absent any external pressure, do you think that Zuckerberg and the rest of the boys in senior management (and Sheryl Sandberg) take it upon themselves to head to a sweat lodge to probe their souls about whether the way they are redrawing the map of our information economy is good for humanity? Sure, that’s likely.

#4 Does Facebook have any responsibility toward media companies?

It’s a great question posed by Thompson. And the answer confirms my worst fears.
Mosseri’s initial response is anodyne enough: “I think we have a number of responsibilities.” News stories are important to people, he says. But then, just as quickly, he contorts himself into a pretzel to explain why it’s also not the case: “…news is a minority of the media content on Facebook, and media is a minority of the overall content in the News Feed.” Ergo, it’s not that big of a responsibility.
Two major fallacies here. The first: If there is less quantity, then there is less importance. My five-year-old niece’s recent birthday was a big hit on Facebook, as I imagine many other birthdays were that day. So, that’s more important to the Facebook community (read: humanity) than the SNAFU alert sent to all the residents of Hawaii warning of an imminent missile attack? The numbers tell us it is.
The second: Reporting, writing and editing a news story of any import takes time, resources and skill. Hence, there will be many fewer of them than there are birthday posts. So if it’s a numbers game, news loses. This is what I’d call self-serving math.

#5 “… there’s understandably going to be a lot of anxiety…”

Here’s some more math: The Pew Research Center reports that 45% of Americans get news from Facebook, a percentage that has been increasing sharply. Why? Because that’s the product Facebook created. It designed itself for that.
As the algorithm tweaks fall into place, and news publishers stand by as their audience plummets, Mosseri concedes: “there’s understandably going to be a lot of anxiety … it’s always a set of trade offs, we do the best we can with the information at hand.” (You possess ALL the information, by the way.) These are not words of someone who sees the news media as partners but as pawns. A post is a post is a post.
But that’s not how this company has operated. Since it burst on the scene, not all that many years ago, it has dangled carrot after carrot in front of news media. Do your headlines this way and you’ll be rewarded. Hey, pivot to video! No, try our Instant Articles product (or else). And then, like Lucy yanking the football, it’s gone. Facebook has moved on.
The heart of the issue is that Facebook wields immense power and is subject to minimal accountability. Changes come when Zuckerberg decrees them. Yes, it’s a publicly traded company. Yes, Congress shall make no law … But the power is real and the accountability is not.
And with all this heft, and all this research, Facebook seems to understand so little about the news it serves up. Take for example this notion that commenting or reacting to news is what makes news valuable. Yes, that’s true some of the time, but it’s also false some of the time. Sometimes we read the news to be informed. To catch up. To be better citizens. Because I didn’t share or like an article about climate change doesn’t mean that I don’t care about climate change.
To treat the value of news purely through the lens of whether people have shared it or had “meaningful interactions” with other members of the Facebook “community” misses the value entirely.
And Dear Facebook, sharing and commenting on every piece of news is actually part of the problem: It is what has thrust news and journalism into this hyper-partisan shithole we’re in right now.
I only have one wish for Zuckerberg. In a few short years, he will be the father of a girl in her tweens. I can only assume that she, too, might become obsessed with the Instagram posts of her friends, whether they liked her pic, or that she might discover that everyone is hanging out without her. And it might drive her to tears. And then her wise parents will decide (unilaterally) that they need to limit her screen time to 30 minutes. It’s for her own well-being, after all.

Tuesday, January 16, 2018

How to catch a criminal using only milliseconds of audio


Scientists can tell far more from your recorded voice than you might think. Image: Pixabay
Simon Brandon, Freelance journalist

A prankster who made repeated hoax distress calls to the US Coast Guard over the course of 2014 probably thought they were untouchable. They left no fingerprints or DNA evidence behind, and made sure their calls were too brief to allow investigators to triangulate their location.
Unfortunately for this hoaxer, however, voice analysis powered by AI is now so advanced that it can reveal far more about you than a mere fingerprint. By using powerful technology to analyse recorded speech, scientists today can make confident predictions about everything from the speaker’s physical characteristics — their height, weight, facial structure and age, for example — to their socioeconomic background, level of income and even the state of their physical and mental health.
One of the leading scientists in this field is Rita Singh of Carnegie Mellon University’s Language Technologies Institute. When the US Coast Guard sent her recordings of the 2014 hoax calls, Singh had already been working in voice recognition for 20 years. “They said, ‘Tell us what you can’,” she told the Women in Tech Show podcast earlier this year. “That’s when I started looking beyond the signal. How much could I tell the Coast Guard about this person?”
Rita Singh is an expert in speech recognition
What your voice says about you
The techniques developed by Singh and her colleagues at Carnegie Mellon analyse and compare tiny differences, imperceptible to the human ear, in how individuals articulate speech. The researchers then break recorded speech down into tiny snippets of audio, milliseconds in duration, and use AI techniques to comb through these snippets looking for unique identifiers.
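The article doesn’t say how Singh’s team implements this slicing step, but the standard first move in speech processing is to cut the waveform into short, overlapping frames before extracting features from each one. A minimal NumPy sketch, with illustrative function names and parameters that are not from the source:

```python
import numpy as np

def frame_audio(signal, sample_rate, frame_ms=10, hop_ms=5):
    """Split a 1-D audio signal into short overlapping frames.

    Each frame is frame_ms milliseconds long; consecutive frames
    start hop_ms milliseconds apart, so neighbours overlap by half.
    """
    frame_len = int(sample_rate * frame_ms / 1000)
    hop_len = int(sample_rate * hop_ms / 1000)
    n_frames = 1 + (len(signal) - frame_len) // hop_len
    return np.stack([
        signal[i * hop_len : i * hop_len + frame_len]
        for i in range(n_frames)
    ])

# One second of 16 kHz audio -> 10 ms frames taken every 5 ms
sr = 16000
audio = np.random.randn(sr)
frames = frame_audio(audio, sr)
print(frames.shape)  # (199, 160): 199 frames of 160 samples each
```

Feature extractors (spectrograms, cepstral coefficients, learned embeddings) are then computed per frame; the overlap ensures that transient articulation cues are not lost at frame boundaries.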
Your voice can give away plenty of environmental information, too. For example, the technology can guess the size of the room in which someone is speaking, whether it has windows and even what its walls are made of. Even more impressively, perhaps, the AI can detect signatures left in the recording by fluctuations in the local electrical grid, and can then match these to specific databases to give a very good idea of the caller’s physical location and the exact time of day they picked up the phone.
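The grid-fluctuation trick described above is known in audio forensics as electrical network frequency (ENF) analysis: mains hum leaks into recordings, and its frequency wanders slightly over time in a pattern logged by grid operators. As a hedged illustration only (not Singh’s actual pipeline), the dominant frequency near the nominal mains value can be pulled out of a recording’s spectrum:

```python
import numpy as np

def estimate_mains_hum(signal, sample_rate, nominal=60.0, band=1.0):
    """Estimate the mains-hum frequency leaked into a recording by
    finding the strongest spectral peak within +/- band Hz of the
    nominal grid frequency (60 Hz in the US, 50 Hz in Europe)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    mask = (freqs > nominal - band) & (freqs < nominal + band)
    return freqs[mask][np.argmax(spectrum[mask])]

# Synthetic 5-second recording: noise plus a faint 59.98 Hz hum
np.random.seed(0)
sr = 8000
t = np.arange(5 * sr) / sr
recording = 0.05 * np.sin(2 * np.pi * 59.98 * t) + 0.5 * np.random.randn(len(t))
print(estimate_mains_hum(recording, sr))  # best match within the FFT's 0.2 Hz resolution
```

A real ENF system tracks this estimate over successive windows and matches the resulting time series against a database of logged grid frequencies, which is what lets analysts recover roughly where and exactly when a recording was made.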
This all applies to a lot more than hoax calls, of course. Federal criminal cases from harassment to child abuse have been helped by this relatively recent technology. “Perpetrators in voice-based cases have been found, have confessed, and their confessions have largely corroborated our analyses,” says Singh.
Portraits in 3D
And they’re just getting started: Singh and her fellow researchers are developing new technologies that can provide the police with a 3D visual portrait of a suspect, based only on a voice recording. “Audio can give us a facial sketch of a speaker, as well as their height, weight, race, age and level of intoxication,” she says.
But there’s some way to go before voice-based profiling technology of this kind becomes viable in a court. Singh explains: “In terms of admissibility, there will be questions. We’re kind of where we were with DNA in 1987, when the first DNA-based conviction took place in the United States.”
This has all proved to be bad news for the Coast Guard’s unsuspecting hoaxer. Making prank calls to emergency services in the US is regarded as a federal crime, punishable by hefty fines and several years of jail time; and usually the calls themselves are the only evidence available. Singh was able to produce a profile that helped the Coast Guard to eliminate false leads and identify a suspect, against whom they hope to bring a prosecution soon.
Given the current exponential rate of technological advancement, it’s safe to say this technology will become much more widely used by law enforcement in the future. And for any potential hoax callers reading this: it’s probably best to stick to the old cut-out newsprint and glue method for now. Just don’t leave any fingerprints.

Saturday, January 13, 2018

Facebook’s newsfeed changes: a disaster or an opportunity for news publishers?


Social media and digital executives in newsrooms already have a tough job connecting their content to consumers via social media, but Facebook’s proposed changes in the algorithms of its ‘newsfeed’ are going to make it a lot harder. Social networks offer immense opportunities for reaching vast new audiences and increasing the engagement of users with journalism. The most important platform in the world is about to make that more difficult.
Clearly, this is a blow for news publishers who have spent the last decade or so fighting a battle for survival in a world where people’s attention and advertising have shifted to other forms of content and away from news media brands’ own sites. They are clearly very concerned. Yet, could this be a wake-up call that will mean the better, more adaptive news brands benefit?
I’m not going to argue that this is good news for news publishers, but blind panic or cynical abuse of Facebook is not a sufficient response. The honest answer is that we don’t know exactly what the effect will be because Facebook, as usual, have not given out the detail and different newsrooms will be impacted differently.
It’s exactly the kind of issue we are looking at in our LSE Truth, Trust and Technology Commission. Our first consultation workshop with journalists, and related practitioners from sectors such as the platforms, is coming up in a few weeks. This issue matters not just for the news business. It is also central to the quality and accessibility of vital topical information for the public.
Here’s my first attempt to unpack some of the issues.
Mark Zuckerberg: making time on Facebook ‘well spent’
Firstly, this is not about us (journalists). Get real. Facebook is an advertising revenue generation machine. It is a public company that has a duty to maximise profits for its shareholders. It seeks people’s attention so that it can sell it to advertisers. It has a sideline in charging people to put their content on its platform, too. It is a social network, not a news-stand. It was set up to connect ‘friends’ not to inform people about current affairs. Journalism, even where shared on Facebook, is a relatively small part of its traffic.
Clearly, as Facebook has grown it has become a vital part of the global (and local) information infrastructure. Other digital intermediaries such as Google are vastly important, and other networks such as Twitter are significant. And never forget that there are some big places such as China where other similar networks dominate, not Facebook or other western companies. But in many countries and for many demographics, Facebook is the Internet, and the web is increasingly where people get their journalism. It’s a mixed and shifting picture but as the Reuters Digital News Report shows, Facebook is a critical source for news.
From Reuters Digital News Report 2017
If you read Zuckerberg’s statement he makes it clear that he is trying to make Facebook a more comfortable place to be:
“recently we’ve gotten feedback from our community that public content — posts from businesses, brands and media — is crowding out the personal moments that lead us to connect more with each other.”
His users are ‘telling him’ (i.e. more of them are spending less time on FB) what a plethora of recent studies and books have shown, which is that using Facebook can make you miserable. News content — which is usually ‘bad’ news — doesn’t cheer people up. The angry, aggressive and divisive comment that often accompanies news content doesn’t help with the good vibes. And while the viral spread of so-called ‘fake news’ proves it is popular, it also contributes to the sense that Facebook is a place where you can’t trust the news content. Even when it is credible, it’s often designed to alarm and disturb. Not nice. And Facebook wants nice.
One response to this from journalists is despair and cynicism. The UK media analyst Adam Tinworth sums this approach up in a witty and pithy ‘translation’ of Zuckerberg’s statement:
“We can’t make money unless you keep telling us things about yourself that we can sell to advertisers. Please stop talking about news.”
Another accusation is that Facebook is making these changes because of the increasing costs it is expending at the behest of governments who are now demanding it does more to fight misinformation and offensive content. That might be a side-benefit for Facebook but I don’t think it’s a key factor. It might even be a good thing for credible news if the algorithmic changes include ways of promoting reliable content. But overall the big picture is that journalism is being de-prioritised in favour of fluffier stuff.
Even Jeff Jarvis, the US pioneer of digital journalism who has always sought to work with the grain of the platforms, admits that this is disturbing:
“I’m worried that news and media companies — convinced by Facebook (and in some cases by me) to put their content on Facebook or to pivot to video — will now see their fears about having the rug pulled out from under them realized and they will shrink back from taking journalism to the people where they are having their conversations because there is no money to be made there.”*
The Facebook changes are going to be particularly tough on news organisations that invested heavily in the ‘pivot to video’. These are often the ‘digital native’ news brands who don’t have the spread of outlets for their content that ‘legacy’ news organisations enjoy. The BBC has broadcast. The Financial Times has a newspaper. These organisations have gone ‘digital first’ but like the Economist they have a range of social media strategies. And many of them, like the New York Times, have built a subscription base. Email newsletters provide an increasingly effective by-pass for journalism to avoid the social media honey-trap. It all makes them less dependent on ‘organic’ reach through Facebook.
But Facebook will remain a major destination for news organisations to reach people. News media still needs to be part of that. As the ever-optimistic Jarvis also points out, if these changes mean that Facebook becomes a more civil place where people are more engaged, then journalism designed to fit in with that culture might thrive more:
“journalism and news clearly do have a place on Facebook. Many people learn what’s going on in the world in their conversations there and on the other social platforms. So we need to look how to create conversational news. The platforms need to help us make money that way. It’s good for everybody, especially for citizens.”
News organisations need to do more — not just because of Facebook but also on other platforms. People are increasingly turning to closed networks or channels such as Whatsapp. Again, it’s tough, but journalism needs to find new ways to be on those. I’ve written huge amounts over the last ten years urging news organisations to be more networked and to take advantage of the extraordinary connective, communicative power of platforms such as Facebook. There have been brilliant innovations by newsrooms over that period to go online, to be social and to design content to be discovered and shared through the new networks. But this latest change shows how the media environment continues to change in radical ways, and so the journalism must also be reinvented.
Social media journalist Esra Dogramaci has written an excellent article on some of the detailed tactics that newsrooms can use to connect their content to users in the face of technological developments like Facebook’s algorithmic change:
“if you focus on building a relationship with your audience and developing loyalty, it doesn’t matter what the algorithm does. Your audience will seek you out, and return to you over and over again. That’s how you ‘beat’ Facebook.”
Journalism Must Change
The journalism must itself change. For example, it is clear that emotion is going to be an even bigger driver of attention on Facebook after these changes. The best journalism will continue to be factual and objective at its core — even when it is campaigning or personal. But as I have written before, a new kind of subjectivity can not only reach the hearts and minds of people on places like Facebook, but it can also build trust and understanding.
This latest change by Facebook is dramatic, but it is a response to what people ‘like’. There is a massive appetite for news — and not just because of Trump or Brexit. Demand for debate and information has never been greater or more important in people’s everyday lives. But we have to change the nature of journalism not just the distribution and discovery methods.
The media landscape is shifting to match people’s real media lives in our digital age. Another less noticed announcement from Facebook last week suggested they want to create an ecosystem for local personalised ‘news’. Facebook will use machine learning to surface news publisher content at a local level. It’s not clear how they will vet those publishers but clearly this is another opportunity for newsrooms to engage. Again, dependency on Facebook is problematic, to put it mildly, but ignoring this development is to ignore reality. The old model of a local newspaper for a local area doesn’t effectively match how citizens want their local news anymore.
What Facebook Must Do
Facebook has to pay attention to the needs of journalism, and as it changes its algorithm to reduce the amount of ‘public content’ it has to work harder at prioritising quality news content. As the Guardian’s outstanding digital executive Chris Moran points out, there’s no indication from Facebook that they have factored this into the latest change.
Fighting ‘fake news’ is not just about blocking the bad stuff, it is ultimately best achieved by supporting the good content. How you do that is not a judgement Facebook can be expected or relied upon to do by itself. It needs to be much more transparent and collaborative with the news industry as it rolls out these changes in its products.
When something like Facebook becomes this important to society, it is in the public interest, as with any other public utility, to make policy that maximises social benefits. This is why governments around the world are considering and even enacting legislation or regulation of platforms like Facebook. Much of this is focused on specific issues such as the spread of extremist or false and disruptive information.
