US Politics: 2 Fash 2 Impeach


Morpheus

5 hours ago, Fez said:

Never tell The Snake the odds!

Also, he was a gunslinger! Like Brett Favre. Like Rex Grossman. You can't tell a gunslinger to hang up the big guns.

How fucking dare you compare The Snake to The Sex Cannon.

46 minutes ago, Zorral said:

Why anybody thought that is hard to understand, considering her actions and her words.

She's changed: she's so desperate to be prez that she's made some truly, astonishingly stupid decisions lately. That's why people keep wondering what the hell happened to her.

This article is entirely too benevolent about Haley's backing of Trump, neglecting to mention just how low she's gone (Trump is "truthful" and would "never knowingly lie"), but it gives some context.

https://www.politico.com/interactives/2021/magazine-nikki-haleys-choice/

1 hour ago, DanteGabriel said:

I think I've figured out why Ted Cruz grew a beard. Eunuchs can't grow them, so that's the only evidence we have that his balls aren't sitting in a glass jar somewhere in Mar-a-Lago.

Face merkin IMO.

56 minutes ago, Zorral said:

Why anybody thought that is hard to understand, considering her actions and her words.

Strongly disagree. She's a great candidate on paper and previously did a good job of playing both sides of the party. However, at the moment it would seem absolute loyalty to Trump is the bar any candidate must meet, and she just failed to clear it.

Color me skeptical that blind obedience to Trump is going to be a selling point in the 2024 GOP primary. What Trump fans love about Trump is that he owns the libs. But Trump is going to have very few avenues to continue doing that in the next four years. Sure, he can complain about Biden on Fox News, but so what?

These Senators and Governors can start making a name for themselves by actually doing something. Attacking AOC, Pelosi, Biden, etc., is going to be a lot more popular than just parroting Trump's attacks on those people.

Has someone brought up applying Ronna McDaniel's standards to Trump yet? I had to quit watching.

https://www.cbsnews.com/news/newsom-recall-rnc-pledge-250000/

Quote

"Governor Newsom's authoritarian measures, blatant overreach and complete mishandling of the COVID-19 pandemic have proven that he is woefully unqualified to lead the state of California," RNC Chair Ronna McDaniel said in a statement. "It is time the people use their Constitutional recourse to remove him from power."

47 minutes ago, Tywin et al. said:

Strongly disagree. She's a great candidate on paper and previously did a good job of playing both sides of the party. However, at the moment it would seem absolute loyalty to Trump is the bar any candidate must meet, and she just failed to clear it.

On paper I wouldn't have voted for her for anything, appointed her to anything, or confirmed her for anything.

2 minutes ago, Zorral said:

On paper I wouldn't have voted for her for anything, appointed her to anything, or confirmed her for anything.

Eh. He forgives people all the time as long as they genuflect and sell out their wives.

Then there is the OTHER Supreme Court, the one whose decisions have the potential to rival those of the Supreme Court in DC. Big question is, can Kalbear be appointed to it?

https://www.newyorker.com/tech/annals-of-technology/inside-the-making-of-facebooks-supreme-court?utm_source=nextdraft&utm_medium=email

On a morning in May, 2019, forty-three lawyers, academics, and media experts gathered in the windowless basement of the NoMad New York hotel for a private meeting. The room was laid out a bit like a technologist’s wedding, with a nametag and an iPad at each seat, and large succulents as centerpieces. There were also party favors: Facebook-branded notebooks and pens. The company had convened the group to discuss the Oversight Board, a sort of private Supreme Court that it was creating to help govern speech on its platforms. The participants had all signed nondisclosure agreements. I sneaked in late and settled near the front. “Clap if you can hear me,” the moderator, a woman dressed in a black jumpsuit, said.

Since its founding, in 2004, Facebook had modelled itself as a haven of free expression on the Internet. But in the past few years, as conspiracy theories, hate speech, and disinformation have spread on the platform, critics have come to worry that the company poses a danger to democracy. Facebook promised to change that with the Oversight Board: it would assemble a council of sage advisers—the group eventually included humanitarian activists, a former Prime Minister, and a Nobel laureate—who would hear appeals over what kind of speech should be allowed on the site. Its decisions would be binding, overruling even those of Mark Zuckerberg, the company’s founder. Zuckerberg said he had come to believe that a C.E.O. shouldn’t have complete control over the limits of our political discourse. “Maybe there are some calls that just aren’t good for the company to make by itself,” he told me.

In 2019, Facebook agreed to let me report on the process, and I spent eighteen months following its development. Last month, the board ruled on its first slate of cases, which dealt with, among other topics, the glorification of Nazis and misinformation about the coronavirus pandemic. In the next few months, it will decide an even larger question: whether Donald Trump should be cut off indefinitely from his millions of followers for his role in inciting the insurrection at the Capitol, on January 6th. Nathaniel Persily, a law professor at Stanford, told me, “How the board considers the issues and acts in that case will have dramatic implications for the future of the board, and perhaps for online speech in general.”

In the beginning, Facebook had no idea how the board would work. To come up with ideas, the company held workshops with experts in Singapore, New Delhi, Nairobi, Mexico City, Berlin, and New York. “My job was to go all over the world and get as much feedback as possible,” Zoe Darmé, who oversaw the consultation process, told me. At the workshop in New York, in the hotel basement, participants sat at tables of eight or nine and ran simulations of cases. I sat between Jeff Jarvis, a journalism professor, and Ben Ginsberg, a Republican lawyer who represented George W. Bush in Bush v. Gore.

For our first case, the moderator projected a picture of a smiling girl in a yearbook photo, with a cartoon thought bubble that read “Kill All Men.” Facebook had removed the post for violating its hate-speech rules, which ban attacks based on “sex, gender identity.” To many, this seemed simplistic. “It’s a joke,” one woman said. “There has to be an exception for humor.” Facebook’s rules did include a humor exception, for instances in which the user’s intent was clear, but it was difficult to discern this person’s motivation, and attendees worried that a broad carve-out for jokes could easily provide cover for hate speech. Carmen Scurato, who works at Free Press, an Internet-advocacy organization, pointed out the historical disadvantage of women, and argued that hate-speech policies ought to take power dynamics into account. In the end, the group voted to restore the photo, though no one knew exactly how to write that into a rule.

This kind of muddy uncertainty seemed inevitable. The board has jurisdiction over every Facebook user in the world, but intuitions about freedom of speech vary dramatically across political and cultural divides. In Hong Kong, where the pro-democracy movement has used social media to organize protests, activists rely on Facebook’s free-expression principles for protection against the state. In Myanmar, where hate speech has contributed to a genocide against the Rohingya, advocates have begged for stricter enforcement. Facebook had hoped, through the workshops, to crowdsource beliefs about speech, but the results were more contradictory than anticipated. In New York, for example, sixty per cent of people voted to reinstate the “Kill All Men” post, but only forty per cent did so in Nairobi. Amid other theories, Darmé speculated, “Where countries are perhaps more concerned about safety, because they live in an area with less rule of law—and therefore there’s a chance of a group actually maybe trying to kill all men—there’s less concern about free speech.” The full explanation is likely more complex; regardless, the divergent results underscored the difficulty of creating a global court for the Internet.

Some of the workshops devolved into disarray. In Singapore, Nairobi, and New Delhi, a few participants refused to sign the nondisclosure agreements, protesting Facebook’s lack of transparency; in Germany, someone commandeered the microphone and berated the company for killing democracy. “We had to learn to put on our body armor,” Darmé said. In New York, the session remained civil, but just barely. Some participants thought that the board would be ineffectual. “The whole thing seemed destined for failure,” Sarah T. Roberts, a professor of information studies at U.C.L.A., told me. “Skeptics will think it’s captured by the corporate interest of Facebook. Others will think it doesn’t do enough, it’s a pseudo-institution.” Some predicted the board would come to have grand ambitions. Tim Wu, a law professor at Columbia, said, “If the board is anything like the people invited to New York, I wouldn’t be surprised if it got out of control and became its own little beast that tried to change the world one Facebook decision at a time.”

Participants had been instructed to use an app called Slido to submit questions for group discussion, which could be voted up or down on the agenda. The results were projected on a screen at the front of the room. The app had worked well abroad, but in New York it became a meta-commentary on content moderation. Sophisticated questions about “automatic tools for takedowns” and the “equity principle with diverse communities” were soon overtaken by a joke about “Game of Thrones.” Posts were initially anonymous, but users quickly found a way around the system; “Harold” wrote, “I figured out how to identify self,” which provoked laughter. The moderator shouted to regain control of the room. In the midst of the chaos, someone posted, “Can we abandon Slido and talk?,” which quickly accumulated likes.

The idea that Facebook, like a fledgling republic, would need to institute democratic reforms might have seemed silly a decade ago. In 2009, shortly after the company was criticized for quietly changing its terms of service to allow it to keep users’ data even after they deleted their accounts, it released a video of Zuckerberg, clad in an uncharacteristic button-up shirt and a tie, announcing a “new approach to site governance.” People would be able to vote on Facebook’s policies; the company called it “a bold step toward transparency.” In the first referendum, on whether to change the terms of service, only 0.32 per cent of users voted. “In its own eyes, Facebook has become more than merely a recreational website where users share photos and wish each other a happy birthday,” a columnist for the Los Angeles Times wrote. “It is now a global body of citizens that should be united and protected under a popularly ratified constitution. But it’s hard to have a democracy, a constitution or a government if nobody shows up.” In 2012, the project was quietly shuttered, and, as with Crystal Pepsi, Google Wave, and the Microsoft Zune, no one remembers that it existed.

This was still a hazily optimistic time for Facebook. The company promised to “give people the power to share and make the world more open and connected.” As more users joined tech platforms, companies instituted rules to sanitize content and keep the experience pleasant. Airbnb removed housing ads that displayed Nazi flags; Kickstarter disallowed crowdfunding for “energy food and drinks”; Etsy told users to be “helpful, constructive, and encouraging” when expressing criticism. Facebook hired content moderators to filter out pornography and terrorist propaganda, among other things. But, because it saw itself as a “neutral platform,” it tended not to censor political speech. The dangers of this approach soon became apparent. Facebook now has some three billion users—more than a third of humanity—many of whom get their news from the site. In 2016, Russian agents used the platform in an attempt to sway the U.S. Presidential election. Three years later, a white supremacist in New Zealand live-streamed a mass shooting. Millions of people joined groups and followed pages related to QAnon, a conspiracy theory holding that the world is controlled by a cabal of Satan-worshipping, pedophilic Democrats. The First Amendment has made it difficult for the U.S. government to stop toxic ideas from spreading online. Germany passed a law attempting to curb the dissemination of hate speech, but it is enforceable only within the country’s borders. As a result, Facebook has been left to make difficult decisions about speech largely on its own.

Over time, the company has developed a set of rules and practices in the ad-hoc manner of common law, and scholars have long argued that the system needed more transparency, accountability, and due process. The idea for the Oversight Board came from Noah Feldman, a fifty-year-old professor at Harvard Law School, who has written a biography of James Madison and helped draft the interim Iraqi constitution. In 2018, Feldman was staying with his college friend Sheryl Sandberg, the chief operating officer of Facebook, at her home in Menlo Park, California. One day, Feldman was riding a bike in the neighboring hills when, he said, “it suddenly hit me: Facebook needs a Supreme Court.” He raced home and wrote up the idea, arguing that social-media companies should create “quasi-legal systems” to weigh difficult questions around freedom of speech. “They could cite judicial opinions from different countries,” he wrote. “It’s easy to imagine that if they do their job right, real courts would eventually cite Facebook and Google opinions in return.” Such a corporate tribunal had no modern equivalent, but Feldman noted that people need not worry: “It’s worth recalling that national legal systems themselves evolved from more private courts administered by notables or religious authorities.” He gave the memo to Sandberg, who showed it to Zuckerberg. For a few years, Zuckerberg had been thinking about establishing a “legislative model” of content moderation in which users might elect representatives to Facebook, like members of Congress. A court seemed like a better first step.

In November, 2018, Feldman gave a short presentation to Facebook’s corporate board, at Zuckerberg’s invitation. “I didn’t feel like I was convincing my audience,” he told me. Feldman recalled that some members felt such a body wouldn’t sufficiently improve the company’s legitimacy; others worried that it could make decisions that would contradict Facebook’s business interests. A few minutes in, Zuckerberg defended the proposal. He noted that a huge proportion of his time was devoted to deliberating on whether individual, high-profile posts should be taken down; wouldn’t experts be better at making those decisions? The idea remained controversial, but Facebook’s corporate structure allows Zuckerberg to make unilateral decisions. Soon after, he ordered the project to begin. “I was kind of stunned,” Feldman told me. “Like, holy shit, this is actually going to happen.”

One day in June, 2019, an Uber dropped me off at Facebook’s campus, in a parking lot full of Teslas. For the past couple of years, while working as a law professor, I had been researching how tech companies govern speech. That morning, I headed into MPK 21, a five-hundred-thousand-square-foot building designed by Frank Gehry, with a rooftop garden inhabited by wild foxes. (Signs discourage interaction with them.) Walls are plastered with giant posters bearing motivational phrases like “Nothing at Facebook Is Somebody Else’s Problem” and “The Best Way to Complain Is to Make Things.” When you visit, you register at a touch-screen kiosk and sign a nondisclosure agreement pledging that you won’t divulge anything you see. The company knew that I was coming as a reporter, but the woman at the desk didn’t know how to print a pass without a signed agreement; eventually, another employee handed me a lanyard marked “N.D.A.” and said, “We’ll just know that you’re not under one.”

I began by shadowing Facebook’s Governance and Strategic Initiatives Team, which was tasked with creating the board. The core group was made up of a dozen employees, mostly in their thirties, who had come from the United Nations, the Obama White House, and the Justice Department, among other places. It was led by Brent Harris, a former consultant to nonprofits who frequently arrived at our meetings eating a granola bar. The employees spent much of their time drafting the board’s charter, which some called its “constitution,” and its bylaws, which some called its “rules of the court.” During one meeting, they used pens, topped with a feather, to evoke the quills used by the Founding Fathers.

The group was young and highly qualified, but it was surrounded by tech executives who sometimes became actively involved. Early drafts of the charter included a lot of dry, careful legal language, but in later versions some of it had been stripped out. “Feedback is coming from people high in the company, who are not lawyers,” Harris told me, during one meeting. I noted that someone had changed all references to “users” in the charter to “people,” which seemed to imply that the board governed not only Facebook’s customers but everyone in the world. Harris exchanged glances with another employee. “Feedback is coming from people very high in the company,” he said. I later learned from the team that Zuckerberg had been editing the charter to make it “more approachable.”

Employees on the governance team sometimes referred to themselves as “true believers” in the board. Kristen Murdock, who was an intelligence officer in the Navy before coming to Facebook, told me, “This is going to change the face of social justice on the Internet.” But some executives did not hold it in the same regard. Elliot Schrage, then the head of global policy and communications, told people involved that he was skeptical of the project and did not think it could be improved. (Schrage claimed, through a spokesperson, that he was “fully supportive of efforts to improve governance” but that he “did have concerns about how to build a credible program.”) Nick Clegg, a former Deputy Prime Minister of the U.K. who was supervising the governance team, told me, in 2019, that he was reluctant to let the board weigh in on sensitive topics, at least early on. “I would love to think that we’d have a relatively uncontroversial period of time,” he said. At one point, a director of policy joked about ways to make the board seem independent, asking, “How many decisions do we have to let the Oversight Board win to make it legit?”

In time, the workings of the court came together. The board originally included twenty members, who were paid six-figure salaries for putting in about fifteen hours a week; it is managed by an independent trust, which Facebook gave a hundred and thirty million dollars. (“That’s real money,” a tech reporter texted me. “Is this thing actually for real?”) According to Facebook, as many as two hundred thousand posts become eligible for appeal every day. “We are preparing for a fire hose,” Milancy Harris, who came to the governance team from the National Counterterrorism Center, said. The board chooses the most “representative” cases and hears each in a panel of five members, who remain anonymous to the public. Unlike in the Supreme Court, there are no oral arguments. The user submits a written brief arguing her case; a representative for the company—“Facebook’s solicitor general,” one employee joked—files a brief explaining the company’s rationale. The panel’s decision, if ratified by the rest of the members, is binding for Facebook.
