Jeremy Blackburn, Binghamton University, State University of New York; Robert W. Gehl, Louisiana Tech University, and Ugochukwu Etudo, University of Connecticut
In the wake of the assault on the U.S. Capitol on Jan. 6, Twitter permanently suspended Donald Trump’s personal account, and Google, Apple and Amazon shunned Parler, which at least temporarily shut down the social media platform favored by the far right.
Dubbed “deplatforming,” these actions restrict the ability of individuals and communities to communicate with each other and the public. Deplatforming raises ethical and legal questions, but foremost is the question of whether it’s an effective strategy to reduce hate speech and calls for violence on social media.
The Conversation U.S. asked three experts in online communications whether deplatforming works and what happens when technology companies attempt it.
Jeremy Blackburn, assistant professor of computer science, Binghamton University
The question of how effective deplatforming is can be looked at from two different angles: Does it work from a technical standpoint, and does it have an effect on worrisome communities themselves?
Does deplatforming work from a technical perspective?
Gab was the first “major” platform subject to deplatforming efforts, first with removal from app stores and, after the Tree of Life shooting, the withdrawal of cloud infrastructure providers, domain name providers and other Web-related services. Before the shooting, my colleagues and I showed in a study that Gab was an alt-right echo chamber with worrisome trends of hateful content. Although Gab was deplatformed, it managed to survive by shifting to decentralized technologies and has shown a degree of innovation – for example, developing the moderation-circumventing Dissenter browser.
From a technical perspective, deplatforming just makes things a bit harder. Amazon’s cloud services make it easy to manage computing infrastructure but are ultimately built on open source technologies available to anyone. A deplatformed company or people sympathetic to it could build their own hosting infrastructure. The research community has also built censorship-resistant tools that, if all else fails, harmful online communities can use to persist.
Does deplatforming have an effect on worrisome communities themselves?
Whether or not deplatforming has a social effect is a nuanced question just now beginning to be addressed by the research community. There is evidence that a platform banning communities and content – for example, QAnon or certain politicians – can have a positive effect. Platform banning can reduce growth of new users over time, and there is less content produced overall. On the other hand, migrations do happen, and this is often a response to real world events – for example, a deplatformed personality who migrates to a new platform can trigger an influx of new users.
Another consequence of deplatforming can be users in the migrated community showing signs of becoming more radicalized over time. While Reddit or Twitter might improve with the loss of problematic users, deplatforming can have unintended consequences that can accelerate the problematic behavior that led to deplatforming in the first place.
Ultimately, it’s unlikely that deplatforming, while certainly easy to implement and effective to some extent, will be a long-term solution in and of itself. Moving forward, effective approaches will need to take into account the complicated technological and social consequences of addressing the root problem of extremist and violent Web communities.
Ugochukwu Etudo, assistant professor of operations and information management, University of Connecticut
Does the deplatforming of prominent figures and movement leaders who command large followings online work? That depends on the criteria for the success of the policy intervention. If it means punishing the target of the deplatforming so they pay some price, then without a doubt it works. For example, right-wing provocateur Milo Yiannopoulos was banned from Twitter in 2016 and Facebook in 2019, and subsequently complained about financial hardship.
If it means dampening the odds of undesirable social outcomes and unrest, then in the short term, yes. But it is not at all certain in the long term. In the short term, deplatforming serves as a shock or disorienting perturbation to a network of people who are being influenced by the target of the deplatforming. This disorientation can weaken the movement, at least initially.
However, there is a risk that deplatforming can delegitimize authoritative sources of information in the eyes of a movement’s followers, and remaining adherents can become even more ardent. Movement leaders can reframe deplatforming as censorship and further proof of a mainstream bias.
There is reason to be concerned about the possibility that driving people who engage in harmful online behavior into the shadows further entrenches them in online environments that affirm their biases. Far-right groups and personalities have established a considerable presence on privacy-focused online platforms, including the messaging platform Telegram. This migration is concerning because researchers have known for some time that complete online anonymity is associated with increased harmful behavior online.
In deplatforming policymaking, among other considerations, there should be an emphasis on justice, harm reduction and rehabilitation. Policy objectives should be defined transparently and with reasonable expectations in order to avoid some of these negative unintended consequences.
Robert Gehl, associate professor of communication and media studies, Louisiana Tech University
Deplatforming not only works, I believe it needs to be built into the system. Social media should have mechanisms by which racist, fascist, misogynist or transphobic speakers are removed, where misinformation is removed, and where there is no way to pay to have your messages amplified. And the decision to deplatform someone should be decided as close to democratically as is possible, rather than in some closed boardroom or opaque content moderation committee like Facebook’s “Supreme Court.”
In other words, the answer is alternative social media like Mastodon. As a federated system, Mastodon is specifically designed to give users and administrators the ability to mute, block or even remove not just misbehaving users but entire parts of the network.
For example, despite fears that the alt-right network Gab would somehow take over the Mastodon federation, Mastodon administrators quickly marginalized Gab. The same thing is happening as I write with new racist and misogynistic networks forming to fill the potential void left by Parler. And Mastodon nodes have also prevented spam and advertising from spreading across the network.
Moreover, decisions to block parts of the network aren't made in secret. They're made by local administrators, who announce their decisions publicly and are answerable to the members of their node in the network. I'm on scholar.social, an academic-oriented Mastodon node, and if I don't like a decision the local administrator makes, I can contact the administrator directly and discuss it. There are other distributed social media systems as well, including Diaspora and Twister.
The danger of mainstream, corporate social media is that it was built to do exactly the opposite of what alternatives like Mastodon do: grow at all costs, including the cost of harming democratic deliberation. It’s not just cute cats that draw attention but conspiracy theories, misinformation and the stoking of bigotry. Corporate social media tolerates these things as long as they’re profitable – and, it turns out, that tolerance has lasted far too long.
Jeremy Blackburn, Assistant Professor of Computer Science, Binghamton University, State University of New York; Robert W. Gehl, F. Jay Taylor Endowed Research Chair of Communication, Louisiana Tech University, and Ugochukwu Etudo, Assistant Professor of Operations and Information Management, University of Connecticut
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Online Sex Work Is Becoming Mainstream Thanks to OnlyFans
OnlyFans, the content sharing platform, has begun to popularize online sex work.
Graphic made by Jaylen Minnich
OnlyFans, a creator content sharing platform, has seen a surge of new accounts and new traffic to its website recently. OnlyFans has generous community guidelines that allow creators to post almost anything, including porn, behind a paywall. Big stars and influencers such as Cardi B and Tana Mongeau have been using the platform recently as an extra source of income. Many viral tweets and TikToks have been marketing the website to the public. Although the platform can host other kinds of content, lately sex work has been the most common.
According to SimilarWeb, OnlyFans has increased its total visits by almost 50 million since April 2020 and currently averages around 25 million visits per month. About 65% of these visits come from direct searches, but about 20% come from social media. Viral tweets about success stories are a major factor in this growth.
In 8 months Onlyfans got me a place on the beach in LA, a new car, savings, boobs, and money to spoil my family w gifts/surprise vacations. I work on my own time, stay within my comfort zone, have a boyfriend, and am getting a degree in video, animation, & audio engineering. Lol
— 🕊 (@dovenymph) September 18, 2020
With these positive attitudes toward sex work becoming more commonplace, people are creating more and more accounts. Just searching "OnlyFans" on Twitter yields a plethora of links from a wide array of people.
Domenique Dominguez, 23, a theatre and music major, started their OnlyFans account back in May.
“I had been considering making one since it [OnlyFans] first started popping off,” they said. “But I was in a relationship with someone who wasn’t comfortable with it.”
Dominguez had been receiving requests on Twitter for an OnlyFans account, so after the relationship ended, they began posting explicit content consisting of nude or semi-nude photos, videos and sometimes personal requests.
“I figured if I could make money why would I not,” they said. “I’m comfortable in my body and sharing it on my own terms.”
However, with the recent popularity of online sex work has come an increase in negative comments and backlash. One creator on Twitter, @_cinnamonro11_, was asked in a tweet whether there were any downsides to having an OnlyFans account.
Honestly yes there are so many…. I don’t like that just anybody can screenshot your content, the OF support is apparently trash, and I also don’t like the way some of my subscribers approach me in my messages demanding additional free content just because they subscribed
— (@_cinnamonro11_)
Supporting sex work has become a major topic of discussion in the media thanks to OnlyFans. The public is beginning to accept sex work as a valid form of financial security, especially amongst those who are struggling due to the pandemic. However, this does go against the traditional stance that sex work is unprofessional.
“I want to be a performer with an OnlyFans,” said Dominguez. “I want to normalize it. I’m only adding to the problem if I stop myself in fear of my director or professor seeing it.”
“Sometimes I get scared, but it’s work. I think it’s a totally acceptable thing to do. I want to assist in normalizing that,” they added.
The erosion of the implicit bias surrounding OnlyFans is starting to change people's perceptions.
Sex work is entering the mainstream. Young people are trying to cultivate a society that recognizes and values the skills and work these jobs entail. They're not just last-resort options, but whole businesses.
Author: Jaylen Minnich Hall