Turkey is moving forward with a purchase of Russian SAM batteries, despite warnings from the U.S. and NATO that the integration of Russian defense technology is not consistent with NATO’s security requirements.
- How did we get here? Is this mostly due to President Erdogan, or is it evidence of a longer-term megatrend that may be more difficult to reverse?
- What would be the implications of lesser Turkish involvement in NATO (either by the Turkish government’s choice or NATO’s)?
- How important is Turkey to the NATO “European shield”?
- What should be done (if anything)…
- By the U.S.?
- By other NATO allies (or NATO in its own right)?
I hope we’ll be joined by Bob Sargent, who has some experience with Turkey. It all depends on Bob’s availability and Mike Wolf’s ability to get the “Owl.”
Here’s an opinion piece from the Ellsworth American to get you thinking…but feel free to Google and add more sources in the comments.
Continuing last week’s discussion on social media, let’s consider how new media (including social media), psychological research, and new technology are changing the rules under which democracy operates.
By “new media” I mean both social media and online media that collect information about our behavior. When we read the NY Times online, the NYT comes to know which headlines we click on and which articles we skip, skim, study, and share. From surprisingly little information, algorithms can build up an accurate picture of an individual.
By “manipulation” I mean a covert attempt to change our minds and behavior by exploiting our cognitive biases and flaws rather than by a transparent appeal to our values and our reason. Psychological research has identified what these biases and flaws are, and how to exploit them. (Article in Money)
By “new technology” I mean the growing ability of individuals to create media that appears to be real but is not, and which exploits cognitive biases to effectively manipulate others.
This is a big subject, and here’s an outline of the main points.
- We don’t know ourselves as well as we think we do. (Psychology Today, Scientific American)
- We are more easily manipulated than we believe we are.
- The better an entity knows us, the more effectively we can be manipulated.
- New media (surprisingly) know us better than we know ourselves. (Google knows us better. The Internet knows us better)
- Research shows that the more often we hear a claim repeated, the more likely we are to believe it is true. This holds when the claim is credible but we don’t know whether to believe it, and it holds even when we know it is false. (See “the illusory truth effect.”)
- (If you think that you’re immune to that, you’re wrong! You may be less prone, but you are not immune.)
- Some new media content is provided by the media channel itself; some is provided by advertisers; and some is provided by subscribers.
- Polarizing content is more effective than neutral, “objective” content. Some people will want more of it; some will reject it. Either way, it is more effective at producing action.
- Old media needed to create and present content that attracted attention and engagement on average. So old media content providers had an incentive to produce content that might be polarizing, but not too polarizing, lest it drive away more people than it attracted.
- New media can choose what content to present to each person in order to maximize that person’s attention and engagement. New media content providers therefore have an incentive to deliver as much polarizing content as they can to the people who will be attracted to it. This is as true for the New York Times as it is for Breitbart; it’s just a question of how polarizing, and to whom.
- Political content providers get rapid feedback on the effectiveness of their content and can tune it to maximize response. In many cases, the most polarizing content will get the best response.
- But content doesn’t have to be polarizing to be effective. Facebook’s Ad Library lets you see how the Trump campaign is creating messages that narrowly target audiences. The ads presented to women are different from the ones presented to men. The ads that target Latinos are different from the ones that target people who want to deport immigrants. The ads that target gun control advocates are different from the ones that target people opposed to gun control.
- The new technology means anyone can be a content creator, and within social networks, anyone can be a content provider.
- “Seeing is believing” is no longer true. The newest technology can disrupt our notion of what’s true and what’s not. See “Deepfake.” For example, consider this video. It’s deliberately overdone, so you know it’s not really Zuckerberg. But what if it had not been so transparent? Read this article. See this video with deepfake examples.
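The feedback loop described in the bullets above (serve content, measure engagement, serve more of what worked) can be sketched as a simple “bandit” algorithm. This is purely my own toy illustration, not any platform’s actual system: the message variants and their click-through rates are invented assumptions, chosen so that the polarizing variant engages best. The point is that no one has to decide to favor polarizing content; optimizing for engagement discovers it automatically.

```python
import random

# Toy model (illustrative only): an epsilon-greedy bandit that picks which
# message variant to show, learning from simulated click feedback. The
# "polarizing" variant is assumed to have the highest click-through rate.
TRUE_CTR = {"neutral": 0.05, "mild": 0.08, "polarizing": 0.15}  # assumed rates

def run(rounds=20000, epsilon=0.1, seed=42):
    rng = random.Random(seed)
    shown = {v: 0 for v in TRUE_CTR}
    clicks = {v: 0 for v in TRUE_CTR}
    for _ in range(rounds):
        if rng.random() < epsilon:
            # Occasionally explore a random variant.
            variant = rng.choice(list(TRUE_CTR))
        else:
            # Otherwise exploit the variant with the best observed CTR
            # (unseen variants default to 1.0 so they get tried once).
            variant = max(TRUE_CTR,
                          key=lambda v: clicks[v] / shown[v] if shown[v] else 1.0)
        shown[variant] += 1
        if rng.random() < TRUE_CTR[variant]:  # simulated user click
            clicks[variant] += 1
    return shown

counts = run()
# After enough rounds, the polarizing variant is shown far more than the others.
```

Real ad-delivery systems are vastly more sophisticated, but the incentive structure is the same: whatever gets the most response gets shown the most.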
The result is an environment that was unanticipated by our founding documents and is surprising to many people today.
People have always been manipulated, but in the past they had more choice in deciding who they’d let influence and even manipulate them. A liberal might choose the New York Times; a conservative, the Wall Street Journal. And then you’d see ads targeted to “your kind of people,” intended to move you to action that you generally approved of.
But on Facebook, you don’t choose: political advertisers choose you, based on how much they are willing to pay to get your attention. They’ll target you with messages that have been carefully crafted and iteratively tuned to move your mind as far as possible in the direction they want.
The more they repeat their message, the more your mind will change. It may be a slow process, but the research tells us it’s undeniable. If you think you are immune to such manipulation, then you’re kidding yourself.
With enough time and enough money, anyone can be manipulated.
This was not possible with newspapers and magazines or with books and lectures. But it is possible with the tools that are now available.
What’s coming next?
What do we do about it?
That’s our topic.
We’ll discuss the social media giants’ impact on American society: the quality of discourse, data security, privacy, and other related topics.
- Should social media platforms be subject to legal liability as “publishers” responsible for curating their content?
- Has social media “dumbed down” our citizens?
- Should user agreements for free services (like pretty much all social media, plus Gmail, etc.) be revised to make it clearer why this stuff is “free”? How should they be changed?
Here’s a WaPo article on “browser extensions” to get you started.
And a NYT op-ed piece on how the recent FaceApp affair isn’t really an outlier.