The 2020 Election Will Take Place on Two Different Internets
There are now more people online than ever before, using the internet more heavily than ever before, and the internet is no longer a sideshow or afterthought for political campaigning. The pattern of old platforms being usurped by newer, shinier ones with younger, more engaged users is nothing new (Facebook supplanted MySpace, after all), but the balkanization of platform identity will be on fuller display in the upcoming election than ever before.
If Facebook represents a barely regulated arena of “angry boomer” politics, younger-skewing counterparts like Instagram, or Snapchat, or TikTok (all of which are mostly free from the feature bloat and confusing maze of services that Facebook offers, and which — as a result — do not suffer misinformation at the scale Facebook does) represent simplicity and clarity. If we assume that younger people tend to skew liberal, as studies have shown, then we can relatively safely assume that younger-skewing app user bases will skew liberal as well. And if we assume younger people are less susceptible to misinformation, then we can assume the platforms they prefer are, too. It’s difficult to say how these platforms will affect political discourse. Instagram and Snapchat are arguably more about private messaging than algorithmic publishing, which makes tracking sentiment and deploying programmatic advertising difficult, if not useless. A platform like TikTok, born out of the teen-favorite app Musical.ly, is loaded with first-time voters who get their information almost exclusively from online sources. Whatever shape candidate messaging takes there will be interesting to see.
Our conception of filter bubbles in the past has been as balkanized communities within a single online platform, carved up by algorithmic recommendations. Red feed, blue feed. Going forward, we might see online communities separated into even stricter buckets: different platforms entirely. This situation makes supposed cure-alls for discursive problems (exposing oneself to opposing viewpoints, civil debate, whatever) far more difficult to execute. How would an older person encounter a younger person’s views on an app like Snapchat, where private messaging is the main form of communication, or TikTok, where … well, who knows what the hell Gen Z is doing on TikTok? For many boomers, Facebook might as well be the entire internet, and these sorts of demographic divides entrench Trump’s ability to activate older, passionate voters on Facebook, a group whose members are more likely to be conservative, without any counter.
Granted, it’s not as if the platforms that appeal to all ages are any help. Between these two poles, forming a sort of age-demographic no-man’s-land, are Twitter (a platform that’s loud and fast-moving, where fanatics talk past each other and boomers and millennials fight for supremacy) and YouTube (which has never met an audience it won’t cater to, short of literal Nazism). Those are hardly solutions for people who worry that the upcoming election, and online self-separation, will only make people angrier and more entrenched. For Trump, however, that’s part of a winning strategy.
New York Mag - Brian Feldman
WaPo - Brian Klaas
Deepfakes are coming. We’re not ready.
If 2016 was the election of “fake news,” 2020 has the potential to be the election of “deepfakes,” the new phenomenon of bogus videos created with the help of artificial intelligence. It’s becoming easier and cheaper to create such videos. Soon, even those with rudimentary technical knowledge will be able to fabricate videos so true to life that it becomes difficult, if not impossible, to determine whether a video is real.
In the era of conspiracy theories, disinformation and absurd denials by politicians staring down seemingly indisputable facts, it is only a matter of time before deepfakes are weaponized in ways that poison the foundational principle of democracy: informed consent of the governed. After all, how can voters make appropriate decisions if they aren’t sure what is fact and what is fiction? Unfortunately, we are careening toward that moment faster than we think.
Deepfakes are created by something called a “generative adversarial network,” or GAN. GANs are technically complex, but operate on a simple principle. There are two automated rivals in the system: a forger and a detective. The forger tries to create fake content while the detective tries to figure out what is authentic and what is forged. Over each iteration, the forger learns from its mistakes. Eventually, the forger gets so good that it is difficult to tell the difference between fake and real content. And when that happens with deepfakes, those are the videos that are likely to fool humans, too.
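The forger-and-detective loop described above can be sketched in a few lines. What follows is a hedged toy illustration of the adversarial principle only, not real deepfake code: the "forger" here is a two-parameter generator and the "detective" a logistic scorer, both names and all parameters (g_shift, d_w, the mean-4 target, the learning rate) are my own choices for the sketch, and the "data" is one-dimensional numbers rather than video frames.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Forger (generator): turns noise z into a fake sample via a learned
# scale and shift. Detective (discriminator): a logistic scorer where
# an output above 0.5 means "looks real."
g_scale, g_shift = 1.0, 0.0
d_w, d_b = 0.1, 0.0
lr = 0.01

for step in range(2000):
    z = rng.normal(0.0, 1.0)       # noise input to the forger
    fake = g_scale * z + g_shift   # forger's attempt
    real = rng.normal(4.0, 1.0)    # genuine sample, mean 4

    # Detective update: push score(real) toward 1 and score(fake) toward 0
    # (gradient of cross-entropy loss w.r.t. the logit is p - label).
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(d_w * x + d_b)
        d_w -= lr * (p - label) * x
        d_b -= lr * (p - label)

    # Forger update: nudge its parameters so the detective scores the
    # fake as real (chain rule back through the detective's weight).
    p = sigmoid(d_w * fake + d_b)
    g_scale -= lr * (p - 1.0) * d_w * z
    g_shift -= lr * (p - 1.0) * d_w

# Over the iterations the forger learns from its mistakes: its output
# mean drifts from 0 toward the real data's mean of 4.
print(f"forger's output is now centered near {g_shift:.2f}")
```

In an actual deepfake GAN, both players are deep convolutional networks and the samples are images or video frames, but the competitive dynamic is the same: the detective's objections are exactly the gradient signal the forger trains on, which is why the forger eventually produces output the detective, and humans, cannot distinguish from the real thing.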
Of course, fakes and forgeries are not new. Whether it was the Soviets airbrushing out “undesirables” or Hollywood special effects, convincing imitations of reality have been around for a while. But in both instances, only a few masters of the trade could pull off a convincing fake. Deepfakes, on the other hand, require little technical expertise, meaning that virtually anyone with the right software will be able to make a fake video of just about any person seemingly saying whatever the forger wants.
That democratization of forgery is just around the corner. “I would say within another 18 to 24 months, that technology is going to get to a point where the human brain may not be able to decipher it,” Hany Farid, a professor of computer science at Dartmouth College, recently told me. Soon, the forger will consistently fool us.
Given how poorly our democracies performed with easily debunked fake-news articles about, say, the pope endorsing President Trump, the prospect of having to question videos that we can see with our own eyes is even more harrowing.
Mark Zuckerberg personally made the decision that Facebook would keep running political ads, even though the ads were weaponized in 2016
As the US gears up for its next presidential election in 2020, citizens can expect to keep seeing political ads on Facebook.
The company will continue to run political ads, The Wall Street Journal reported on Thursday, even as it tries to move past the scandals from its social-networking platforms being used to manipulate voters in major elections worldwide in recent years.
Facebook is making some changes to protect its political ads from being exploited by Russia or other bad actors. The company will no longer pay commissions to salespeople for selling political ads, The Journal reported.
That's a big about-face from 2016, when Facebook not only paid commissions but also embedded its staffers into campaigns to help them with their Facebook targeting strategies. It offered such white-glove service to both the Trump campaign and the Clinton campaign. Since Trump's campaign was smaller and less digitally savvy, it used this service heavily. Brad Parscale, Trump's 2020 campaign manager, even praised Facebook for helping it raise money, Wired reported in 2016. And Trump's digital-advertising director, Gary Coby, called one of Facebook's staffers his MVP.
Political ads were once viewed as a promising growth area for Facebook, but now, as Facebook faces increasing scrutiny over how it handles user data, such ads have fallen out of favor internally, one employee told The Journal.
Doctored Pelosi videos offer a warning: The internet isn't ready for 2020
The Pelosi videos and their narratives were not the product of advanced technology, nor did they take a different route to prominence than previous misinformation efforts.
The 2020 election is set to face some very 2016 challenges when it comes to the spread of misinformation.
The emergence of distorted videos of House Speaker Nancy Pelosi, edited to make her appear to have trouble speaking, has provided a stark reminder that technology often remains an enemy of truth in politics, just as it was four years ago. The core issues of social media virality, confirmation bias and the fringe internet-to-conservative media pipeline have endured from 2016 and do not even need particularly sophisticated techniques to do real mischief.
The videos also offer a warning that concerns about election interference from foreign countries should not overshadow the ability of domestic actors to influence what people see, hear and think. President Donald Trump himself distributed one of the carefully edited videos on Twitter on Friday morning, and though he denied knowing that they were altered, he continued pushing their underlying theme that Pelosi is somehow impaired.
Alex Stamos, Facebook’s former chief security officer, said that the Pelosi video illustrated two of the biggest risks ahead of the 2020 election: “shallow fakes” that use real, lightly edited footage to manipulate the discourse, and the power of domestic misinformation efforts.
“A video that is slightly deceptively edited can be the source of very divisive and possibly false narratives,” said Stamos, who is an NBC News contributor and adjunct professor at Stanford University. “With all of our focus on the Russians, we’ve forgotten that domestic actors are the most likely to have insight into the kinds of narratives that will become viral and will have a big impact on people’s beliefs in American institutions and individuals.”
Why it’s so alarming that Trump shared an edited video of Pelosi
DID PRESIDENT TRUMP share a fake clip of Nancy Pelosi? The seemingly simple question is a vexing one to answer. Passing judgment on his behavior is less challenging.
Conservative accounts on social media circulated a clip this week deliberately distorted to make it seem as if the speaker of the House was slurring her speech: “Drunk as a skunk,” commentators declared. The video, which some declared a “deepfake,” employed technology much too simple to merit that term. Deepfakes use artificial intelligence to synthesize human images into a reality that is entirely fabricated; the smear of Ms. Pelosi merely slowed down parts of an existing interview and modified her pitch.
The clip Mr. Trump tweeted alongside the words “PELOSI STAMMERS THROUGH NEWS CONFERENCE” was part of the same narrative, but it was not distorted, or even doctored, so much as it was edited. The clip splices together short segments of Ms. Pelosi (D-Calif.) stuttering in a lowlight reel that offered a misleading impression of a perfectly coherent 21-minute news conference. Mr. Trump did not make this video, or pull it from the right-wing fever swamps of social media. He took it instead from the fever swamp of Fox Business Network.
The clamor for firms such as YouTube, Facebook and Twitter to remove or limit the distribution of these clips as misinformation invites a vexing debate about what counts as fake in the first place. The “slurring” video, accompanied by manufactured accusations of drunkenness, may fall on one side of the line. The stammering video may fall on the other. But drawing that line at all has far-reaching implications. People edit videos all the time, sometimes for fun and sometimes to prove a political point. When does editing become doctoring, and when does doctoring become distorting? Is distorting always impermissible, or does it depend on intent, effect or something else altogether?
These difficulties are both caused by and contribute to the erosion of trust in today’s America, where it is hard to say what there is more of: false cries of “fake news,” or viral “news” that is actually fake. Technology has certainly helped this issue along, providing both an easy means to craft propaganda and an easy means to promote it. The increasing sophistication of image editing, which raises the threat of actual deepfakes filling the Web, will only make matters worse.
DNC urges 2020 campaigns to 'immediately' delete Russian-developed FaceApp from their phones
Don't expect to see any FaceApp memes coming from the 2020 Democrats anytime soon.
The Democratic National Committee on Wednesday urged every 2020 campaign not to use FaceApp, the popular app that allows users to apply filters to photos and has recently been used on social media to age-up pictures, noting it was "developed by Russians," CNN reports. The app was created by Wireless Lab, which is based in Russia.
The DNC's chief security officer, Bob Lord, told the 2020 campaigns the organization has "significant concerns about the app (as do other security experts) having access to your photos, or even simply uploading a selfie." Concerns were previously raised about the app, which notes in its privacy terms that by using it, you "consent to the processing, transfer and storage of information about you in and to the United States and other countries," The Washington Post reports. The company said on Wednesday that "the user data is not transferred to Russia."
Lord said in his warning to 2020 campaigns that "it's not clear at this point what the privacy risks are," per CNN, but that "the benefits of avoiding the app outweigh the risks."
well, why should they have to transfer your data to Russia when they have all the trolls and Russian intelligence they need right here in the good ole US of A?
"I know that human being and fish can coexist peacefully"
--- George W Bush
Insert signature here: ____________________________________________________
A Trump social network readies for launch
Sometimes Twitter isn’t enough.
After spending years alleging anti-conservative bias on social media, President Donald Trump will soon have another way to get his message out how he wants.
Trump’s reelection campaign plans to launch a smartphone app this fall to encourage supporters to donate, volunteer and reel in like-minded voters — all while providing the president more unfiltered access to his followers. Supporters who download the all-in-one app are expected to be able to sign up for a Make America Great Again rally, canvass a neighborhood or call voters, maybe even register to vote as the campaign looks to turn passive supporters into activists.
Perhaps the most important feature will be the app’s use of prizes — maybe VIP seats or a photo with Trump — to persuade the most fervent supporters to recruit their friends, rewarding them as campaigns have been doing for top donors for years, according to people familiar with the plans.
The upcoming launch is the latest sign of how Trump’s team, which ran a ragtag operation in 2016, is using its huge coffers to drive a more professional and data-driven operation. While campaigns have had apps for years, the Trump app is expected to let the president’s team track followers in a more comprehensive way than ever before in its bid to secure the president another White House term.
That reminds me of the marks who signed up for Trump University, where one of the perks was supposed to be a photo taken with Trump, so you'd meet him and such. What they actually got was a photo taken with a cardboard stand-up of Trump. You think Fuckface (nothing against faces) would stand there for a couple of hours to meet a few thousand of his biggest fans?
Learn to Swear in Latin. Profanity with class!
https://blogs.transparent.com/latin/lat ... -in-latin/
Inside Elizabeth Warren's Selfie Strategy
In late March, Jocelyn Roof, a sophomore at the University of Iowa, picked up a call from an unknown number and heard Senator Elizabeth Warren’s voice on the other end of the line. Warren asked her what got her “in this fight”—Roof said she was very concerned about income inequality—and thanked Roof for her $25 donation.
As soon as she got off the phone, Roof took a selfie of her shocked face. She posted it to Snapchat with the caption “MY WHOLE LIFE WAS MADE,” then screenshotted it and posted it to Twitter. Warren retweeted the selfie, with the comment “I’m so glad we got to talk!”
Before the call, Roof said she was undecided about who she would support in the Iowa caucuses. Afterwards, her enthusiasm for Warren “skyrocketed,” she says. “I couldn’t imagine voting for anyone else.” She started donating $5 to the campaign every month and buying snacks for volunteers at field offices. In the first three weeks of September, she registered more than 1,000 new voters on her University of Iowa campus, independent of the Warren campaign.
On Sept. 20, Roof was wearing a “Women for Warren” shirt as she waited in line to take a selfie with the Senator after a rally on her campus. The call and the tweet had made a young undecided Iowa voter into an avid supporter, grassroots donor and potential volunteer.