* * *
🆘‼😂💦 #Germany: #Merkel's Justice Minister #HeikoMaas (SPD) and responsible for the censorship law #NetzDG tweeted in 2014: "Meeting with Turkish Minister of Justice: Blocking of Twitter & Facebook is not our understanding of freedom of expression." Haha! Exactly my humor! pic.twitter.com/TETZIp7AOn
— Onlinemagazin (@OnlineMagazin) January 7, 2018
* * *
Brian Gerrish and Mike Robinson with the last UK Column News of 2017. We will be back on Wednesday 3 January. We would like to wish our viewers a fantastic Christmas and New Year and we hope to see you in 2018.
START – House of Commons inquiry into online hate (of politicians?)…
08:39 – The Guardian: disgraceful ‘propaganda’ article
10:42 – Meet the Guardian journalist responsible: Olivia Solon
13:31 – How the Guardian propaganda piece is crafted
22:00 – Guardian: advertising for AI company who profit from misery
25:12 – Aircraft carrier HMS Queen Elizabeth is leaking…
29:15 – Aircraft: 14th F35 delivered…to the desert
30:14 – ‘Britain’s record on job creation is second to none…’
32:59 – Steel industry opportunities in Britain…
37:02 – Standing up for real journalism in 2017
England creates its very own police thought crime unit to patrol the internet. https://t.co/1a5utCmOth
— Alois Irlmaier (@AloisIrlmaier) December 17, 2017
For those interested, here is the full interview with Palihapitiya:
In the most ironic story of the day, and perhaps of 2017, a former Facebook executive whose job it was to literally get the world hooked on the “internet crack” that is social media, is now calling on people to take a “hard break” from a service which he believes is “ripping apart the social fabric of how society works.” Speaking to a group of students at the Stanford Graduate School of Business, Chamath Palihapitiya, who joined Facebook in 2007 and became its vice president for user growth, said that he feels “tremendous guilt” for his role in building the social media giant and warned that “if you feed the beast, that beast will destroy you…” (you can view the relevant portion of the interview here).
“I feel tremendous guilt.”
“I think we have created tools that are ripping apart the social fabric of how society works. That is truly where we are.”
“I would encourage all of you, as the future leaders of the world, to really internalize how important this is. If you feed the beast, that beast will destroy you. If you push back on it, you have a chance to control it and rein it in.”
“There is a point in time when people need a hard break from some of these tools.”
“The short-term, dopamine-driven feedback loops we’ve created are destroying how society works. No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem — this is not about Russian ads. This is a global problem.”
“So, we’re in a really bad state of affairs right now, in my opinion. It is eroding the core foundations of how people behave by and between each other.”
“And, I don’t have a good solution. You know, my solution is I just don’t use these tools anymore. I haven’t for years. It’s created huge tension with my friends. Huge tensions in my social circles.”
As CNBC notes, Palihapitiya went on to describe an incident in India where hoax messages about kidnappings shared on WhatsApp led to the lynching of seven innocent people. “That’s what we’re dealing with,” said Palihapitiya. “And imagine taking that to the extreme, where bad actors can now manipulate large swathes of people to do anything you want. It’s just a really, really bad state of affairs.” As noted above, Palihapitiya went on to say that he’s only posted to Facebook a handful of times in the past 7 years and said that his children “aren’t allowed to use that s**t.”
Just a few years ago, the idea that artificial intelligence (AI) might be used to analyze aberrant human behavior on social media and other online platforms and report it to law enforcement was merely the far-out premise of dystopian movies such as Minority Report. Now Facebook proudly brags that it will use AI to “save lives” based on behavior and thought-pattern recognition.
What could go wrong?
The latest puff piece in TechCrunch, profiling the innocuous-sounding “roll out” of AI “to detect suicidal posts before they’re reported” (as if it were a mere modest software update), opens with the glowingly optimistic line, “This is software to save lives” – so who could possibly doubt such a wonderful and benign initiative, one which involves AI evaluating people’s mental health? TechCrunch’s Josh Constine begins:
My personal Facebook account, which has the maximum 5,000 friends and an additional 5,000+ followers, has been blocked from posting for three days. My page hasn’t been blocked yet, but we’ll see; I shared the article there, too.
The reason given for this ban by the little pop-up boxes when I logged on just now was that a couple months ago I had shared an article about admitted false flag operations perpetrated by governments around the world. I don’t know what happened that made Facebook’s system decide to crack down on me now all of a sudden, but I do know I’ve been a bit naughtier than usual in my last couple of articles.
The article I got the banhammer for sharing is titled For Those Who Don’t ‘Believe’ In ‘Conspiracies’ Here Are 58 Admitted False Flag Attacks. According to the site’s ticker it has 50,667 shares as of this writing. It’s laden with hyperlinks for further reading, and lists only instances of false flag operations that insiders are on the record as having admitted to themselves. It’s a good compilation of important information. People should be allowed to share it.
The notifications say I can be permanently banned if I continue posting that sort of material. I’ve had that account since 2007.
* * *
The 38-year-old founding president of Facebook, Sean Parker, was uncharacteristically frank about his creation in an interview with Axios. So much so, in fact, that he concluded Mark Zuckerberg will probably block his account after reading this.
Confirming every ‘big brother’ conspiracy there is about the social media giant, Parker explained how social networks purposely hook users and potentially hurt our brains…
“When Facebook was getting going, I had these people who would come up to me and they would say, ‘I’m not on social media.’ And I would say, ‘OK. You know, you will be.’ And then they would say, ‘No, no, no. I value my real-life interactions. I value the moment. I value presence. I value intimacy.’ And I would say, … ‘We’ll get you eventually.'”
“I don’t know if I really understood the consequences of what I was saying, because [of] the unintended consequences of a network when it grows to a billion or 2 billion people and … it literally changes your relationship with society, with each other … It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.”
“The thought process that went into building these applications, Facebook being the first of them, … was all about: ‘How do we consume as much of your time and conscious attention as possible?'”
“And that means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content, and that’s going to get you … more likes and comments.”
“It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.”
A flaw in software used by Facebook to moderate offensive content exposed more than 1,000 employees to suspected terrorists online.
According to a report from The Guardian’s Olivia Solon Friday, the social network found that moderators in 22 departments had their personal profiles become visible to suspected extremists after the issue was discovered in late 2016.
- That major technology companies are openly stifling the free speech of people trying to counter jihad is bad enough; what is beyond unconscionable is that they simultaneously enable Islamic supremacists to spread the very content that the counter-jihadists have been exposing.
- According to the legal complaint, the names and symbols of Palestinian Arab terrorist groups and individuals were known to authorities, and “Facebook has the data and capability to cease providing services to [such] terrorists, but… has chosen not to do so.”
- A separate lawsuit claims that Twitter not only benefits indirectly by seeing its user base swell through the increase of ISIS-linked accounts, but directly profits by placing targeted advertisements on them.
- When jihadist content is permitted to spread unchecked across the globe via cyberspace, it is a matter of national and international security. Tragically for Western civilization, its tech and media icons have been colluding — even if unwittingly — with those working actively to destroy it.
For the past few years, large social media and other online companies have been seeking to restrict or even criminalize content that could be construed as critical of Islam or Muslims, including when the material simply exposes the words and actions of radical Islamists.
The recent attempt by the digital payment platform, PayPal, to forbid two conservative organizations — Jihad Watch and the American Freedom Defense Initiative — from continuing to use the service to receive donations, is a perfect case in point. Although PayPal reversed the ban, its initial move was part of an ongoing war against the free speech of counter-jihadists — those working to expose the ideology, goals, tactics and strategies of Islamic supremacists, and who are trying to defeat or at least to deter the Islamic supremacist global agenda.
A man who described himself as a ‘soldier of Allah’ escaped a terrorism charge because of his Facebook settings.
Leroy McCarthy, 22, was going by the name Abdullah Mahmood when he was arrested for a string of offensive posts.
In one, he wrote about bombing Furness General Hospital, in Cumbria.
But he was not charged with a terrorism offence due to his social media settings.
Lee Dacre, prosecuting, told Furness Magistrates’ Court: ‘He would have been here today on a terrorism charge but for the settings on his Facebook. It’s a legal technicality.’
* * *
Jihad Watch reported on this firing last week, but what Ross Lister actually said had not yet been revealed. Now we know: the comment in question was on a post about the consequences for terrorists, to which former PC Lister wrote: “What? Wrap them in bacon?”
According to Chief Constable Jerry Graham, this was beyond the pale: Lister had “damaged the reputation of the constabulary and undermined public confidence. PC Lister’s behaviour and judgement has fallen so far below the standards expected that I have decided to dismiss him from the constabulary without notice. The communities of Cumbria expect officers and staff to act on their behalf with integrity, fairness and judgement. The trust of local communities, particularly minority groups, is hard won and easily lost.”
By Paul Craig Roberts
Two of America’s most populous states, Texas and Florida, are in hurricane ruins, and Washington is fomenting more wars.
The US national debt is now over $20 trillion, and Washington is fomenting more wars.
The entire world is helping Washington foment wars – including the two targeted countries themselves, Russia and China. Believe it or not, both Russia and China voted with Washington on the UN Security Council to impose more and harsher sanctions on North Korea, a country guilty of nothing but a desire to have the means to protect itself from the US and not become yet another Washington victim like Afghanistan, Iraq, Libya, Somalia, Yemen, Syria, Serbia, and Ukraine, which was overthrown in a US coup and is now poverty-stricken.