Sunday, November 27, 2016

Draft Report


Negativity online - what is it and why is it a problem?
Every day we wake up to at least a couple of pieces of bad news, which show up first on Facebook or on news websites. Checking the comments or discussions on social media, we see a lot of people hurling abuse and insulting each other. The result? A demoralized person.

Nowadays, a large portion of the content posted and shared on social media presents negative news and ideas. The rants, arguments and insults that are common in comment sections only add to the negative atmosphere online. This continuous bombardment of negativity affects social media users, especially their mental well-being. Positive headlines and constructive discussions do pop up from time to time, but such content is shared on a minuscule scale compared to the negative content. Many people nowadays feel overwhelmed by negativity. This may cause many problems, including depression, anxiety, stress and even post-traumatic stress disorder (PTSD) [3, 5], as negativity strongly affects a person's mental well-being and decreases their quality of life.

There is a public sentiment that social media is becoming increasingly negative. Moreover, rude behaviour and online fights are commonplace. Some extreme cases have sparked public discussions.

For example, several online bullying cases have surfaced on Ask.fm, a question-and-answer social networking website. Several people have been the victims of anonymous threats of violence, rape and death, sometimes leading to tragedy. Below is one such example, a victim of cyberbullying who committed suicide.

Hannah Smith, aged 14, was found hanged in her bedroom in Lutterworth, Leicestershire, by her sister Jo, 16. Her father, Dave, said she had been savagely bullied on the question-and-answer website Ask.fm.
"I have just seen the abuse my daughter got from people on Ask.fm and the fact that these people can be anonymous is wrong," Smith, 45, wrote on his Facebook page.
(From: The Guardian - Ask.fm: is there a way to make it safe [2])

Cyberbullying and other negative behaviour exist not only on Ask.fm but on all social media platforms, including but not limited to Facebook, Twitter, Instagram, Snapchat and WhatsApp.

Everyone active on the internet and social media is affected by negative content to some extent. At some point, everyone comes across rude behaviour, cyberbullying and bad news. The effect that negative content produces differs from person to person: while some can easily move on, others are strongly affected by what they see and read, or by the behaviour they experience from others.

Our group and goals

We are a diverse, multicultural group of international students from Vietnam, India, Russia and Romania. This enables us to understand the problem from different viewpoints.
As a group, we value constructive discussion, the right to criticize (but not to bully), and positivity online. We believe everyone should have the right to control the level of negativity around them and, consequently, to consume the type of content they want.

Our goal is to reduce negativity on social media and promote positive content. Examples of this ‘negativity’ include hate speech, rude comments, repetitive and useless bad news, uncensored brutal images, etc.

As we are dealing with online negativity, which is in itself difficult to quantify, we cannot provide a clear-cut numeric way to measure our success. However, we have found the following metrics appropriate for understanding the impact of our service:

  • User feedback: do users feel they can reduce the negativity in their online environment?
  • Tracking the number of positivity/negativity related hashtags (a minimal counting sketch follows this list)
  • Tracking the number of users of our service; a growing user base indicates success
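To make the hashtag metric concrete, here is a minimal sketch of how such counting could work. The hashtag lists and sample posts are our own illustrative assumptions, not real data:

```typescript
// Hypothetical hashtag lists; the real lists would be curated over time.
const POSITIVE_TAGS = ["#goodnews", "#positivity", "#kindness"];
const NEGATIVE_TAGS = ["#fail", "#hate", "#outrage"];

// Count how many of the given tags appear across a batch of post texts
// (each tag is counted at most once per post).
function countTags(posts: string[], tags: string[]): number {
  return posts.reduce((total, post) => {
    const lower = post.toLowerCase();
    return total + tags.filter((tag) => lower.includes(tag)).length;
  }, 0);
}

// Example usage with a toy sample of posts:
const sample = ["Some #goodnews today!", "So much #outrage online", "#kindness wins"];
console.log("positive:", countTags(sample, POSITIVE_TAGS)); // positive: 2
console.log("negative:", countTags(sample, NEGATIVE_TAGS)); // negative: 1
```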

Background

Stakeholders in creating and promoting negative content


We have tried to describe the stakeholders involved in content posted on social media in the following image:


This is a logical grouping based on the roles people take in creating and spreading content on social media. All the groups influence which content and behaviours are allowed or become popular, and each group plays some part in making content popular. Some groups oppose each other: content consumers, for example, can be divided into two opposing camps, one that interacts constructively and one that often vandalizes. Moreover, the “Enablers” and “Policing/Admins” groups play a major role in how content spreads and exercise control over content consumers.

The grouping presents several advantages for our task of reducing negative content and behaviours. Firstly, it helps us pin down the details of the societal problem we are focusing on and who is involved in perpetuating it. Secondly, it helps us understand the roles in creating and distributing content, and how the representatives of these roles interact with each other. Overall, we are looking to provide a win-win solution for every stakeholder, and a clear understanding of the actors will help us achieve that goal.

Current initiatives and solutions


Several social media platforms provide mechanisms for reporting activities such as cyberbullying, obscene content, etc. Such content gets removed only after people report it and it comes to the notice of the authorities. Until then, people remain exposed to it.

Some people on social media are trying to promote positive content in order to cater to those who voluntarily seek it. These services take the form of pages and groups (on Facebook and similar websites) or full-fledged websites that post only positive content and good news. A few examples of such services are Brightside and The Better India.

There are also some services that try to create a completely new social network, built on social psychology principles, to foster a positive social media environment. Such services look for ways to instill positivity based on credible social psychology research. One example is the YOU-app [4], a completely new social network that promotes positive actions and mindfulness.

Restrictions and limitations


We have considered the following restrictions and limitations for our solution:

Style of current business - There is a reason negative content spreads so rapidly on social media: it quickly catches people's attention, and the current style of doing business online is all about attention. Unfortunately, we cannot remove negative content, nor can we stop negative behaviours. What we are trying to do is give people the option of having less negativity thrown at them, if they so choose.

Legal issues - As we are planning to influence the content the users will see, we must carefully consider all the legal ramifications.

Schedule - Since we do not have an opportunity to develop this service in this course, we are only going to come up with a concept of the service.

Creation of information bubbles - One issue that arises from blocking negative content is the creation of information bubbles for the user. Exposing oneself to only a certain type of content may lead to a skewed view of the world. We are presently considering ways to avoid these information bubbles. For example, the user could be presented with the content once and then choose to hide it in the future, rather than having all content on a topic hidden preemptively.

Solution concept and approach

Our solution


We have named our solution “The Happy Echo”. The idea is to filter negative content from the user's display based on their preferences.


The Happy Echo is a service that would work with social media accounts, in browsers or in apps, to curate and display content according to the user's wishes. The service would act as middleware between the content and the user: while the user is browsing social media websites, the content would first pass through our service to be analyzed and modified according to the user's preferences before being displayed. The following image is a detailed representation of the service, and a minimal sketch of the same pipeline is shown below:
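As a rough illustration of the middleware idea, here is a minimal sketch in TypeScript. The FeedItem and Preferences shapes, the keyword-based scoring and the threshold are all illustrative assumptions; a real implementation might score content with a trained classifier instead:

```typescript
// Shape of a single piece of feed content (illustrative).
interface FeedItem {
  id: string;
  text: string;
}

// User preferences relevant to filtering (illustrative).
interface Preferences {
  blockedKeywords: string[]; // user-defined words to filter on
  threshold: number;         // items scoring above this are hidden
}

// Stand-in scorer: counts how many blocked keywords a post contains.
function negativityScore(item: FeedItem, prefs: Preferences): number {
  const lower = item.text.toLowerCase();
  return prefs.blockedKeywords.filter((kw) => lower.includes(kw)).length;
}

// The core pipeline: analyze each item, filter out the worst,
// then re-sort so the most positive content comes first.
function processFeed(items: FeedItem[], prefs: Preferences): FeedItem[] {
  return items
    .map((item) => ({ item, score: negativityScore(item, prefs) }))
    .filter(({ score }) => score <= prefs.threshold)
    .sort((a, b) => a.score - b.score)
    .map(({ item }) => item);
}
```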


The most important features of the service are as follows:

  • Filtering - Based on the preferences, the service can hide or suppress negative content. The filtering can be based on, for example, user-generated keywords or machine learning algorithms (a browser-side sketch after this list shows one way to do this).
  • Re-sorting and highlighting - The content on a page can be re-sorted or highlighted so that the user can view positive content more easily than negative content.
  • Tracking - The service can be configured to track user behaviour and actions and then make suggestions based on them. For example, if the user is being shown too much negative content, the service should be able to detect that and suggest similar but positive content.
  • Settings - The most important aspect of any service is its flexibility in adapting to user preferences, so the user should be able to configure the service properly. The configuration settings could cover, among other things, the level of filtering, custom keywords, re-sorting and highlighting, and the level of information access available to the service.
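The following is a hypothetical content-script sketch of the filtering and highlighting features. The ".post" selector, the keyword list and the threshold are assumptions for illustration; each platform would need its own selectors and tuning:

```typescript
// Illustrative keyword list; in practice this would come from user settings.
const BLOCKED = ["hate", "awful", "disgusting"];

// Count how many blocked keywords appear in a post's text.
function scorePost(text: string): number {
  const lower = text.toLowerCase();
  return BLOCKED.filter((kw) => lower.includes(kw)).length;
}

// Hide posts scoring above the threshold; highlight clearly positive ones.
function applyPreferences(threshold: number): void {
  document.querySelectorAll<HTMLElement>(".post").forEach((post) => {
    const score = scorePost(post.innerText);
    if (score > threshold) {
      post.style.display = "none";            // filtering: suppress negative posts
    } else if (score === 0) {
      post.style.outline = "2px solid green"; // highlighting: surface positive posts
    }
  });
}

// Re-apply whenever the platform injects new posts (infinite scroll, etc.).
new MutationObserver(() => applyPreferences(1)).observe(document.body, {
  childList: true,
  subtree: true,
});
applyPreferences(1);
```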

With machine learning and artificial intelligence algorithms becoming increasingly capable, it makes sense to put them to use. The service should be able to tell the difference between content that classifies as negative and content that should be considered positive. There should also be a provision for crowdsourced training, allowing users of the service to flag negative content. Based on intelligent algorithms and user flagging, the service could become much more effective at detecting negative content; a toy sketch of this feedback loop follows.
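Below is a toy sketch of the crowdsourced feedback loop, under the assumption that flagging a post simply raises the weight of the words it contains. A real service would train a proper classifier; this only shows the mechanism:

```typescript
// Crowd-learned word weights, shared across users in a real deployment.
const weights = new Map<string, number>();

// Naive tokenizer: lowercase words longer than two characters.
function tokenize(text: string): string[] {
  return text.toLowerCase().split(/\W+/).filter((w) => w.length > 2);
}

// Called when a user flags a post as negative: bump each word's weight.
function flagAsNegative(text: string): void {
  for (const word of tokenize(text)) {
    weights.set(word, (weights.get(word) ?? 0) + 1);
  }
}

// Score later posts against the crowd-learned weights.
function crowdScore(text: string): number {
  return tokenize(text).reduce((sum, w) => sum + (weights.get(w) ?? 0), 0);
}

flagAsNegative("This is awful, full of hate");
console.log(crowdScore("more hate again")); // 1: "hate" was learned from the flag
```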

As a result, users will be exposed to less negative content and have easier access to positive content, which in turn means less stress and healthier content consumers. In addition, the solution would work with existing social media accounts and would be easy to integrate and use. Most importantly, the service could deliver content based on one's clearly defined preferences.

What makes us different?


We are trying to fix negativity in existing social media channels instead of creating a new social network. It is difficult for users to switch to a completely new social network; more importantly, users are usually unwilling to change after being accustomed to a dominant social media platform for a long time. Therefore, we think it is best to improve the existing setup. Apart from that, we are also giving users full control over the content they consume: through their preference settings, users decide what content is filtered and what is displayed.

Initial target group


Our current target group is students. Younger people are more active on the Internet and frequently use several social media platforms. Statistics say that around 25% of Facebook's users are in the 18-24 age group, the usual student age, and other services like Instagram and Snapchat likely have similar figures [1]. We have selected students because they are more open-minded and open to change, and they often try out new technologies, which would help the early adoption of our concept.

Even so, “students” is still a huge group. As a preliminary target market (for the first release), we will choose Aalto University students. This makes it easier to get started, since students from the same university are more likely to try out the solution and give feedback for improvement.

Ideally, the solution could be used by any social media user in need of it, regardless of age, gender, social category etc.

Minimum viable product and initial business model


The minimum viable product is a browser extension that acts as middleware between users and the content they consume. Later, this would be expanded to other technology platforms such as mobile and smart devices.

As our goal is to solve a problem that exists in society, we believe at least the minimum functionality should be provided free of charge to end users. Depending on user needs and the response to our solution, we can later introduce more advanced features for a small fee. One such advanced feature could be custom-built preference packs that control the type of content to be filtered, for example a children's preference pack that filters content inappropriate for children (a sketch of one possible representation follows).
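As a sketch, a preference pack could be represented as a small, self-describing configuration object. The field names and the contents of the children's pack below are illustrative assumptions:

```typescript
// One possible shape for a purchasable preference pack.
interface PreferencePack {
  name: string;
  blockedKeywords: string[];                // words that trigger filtering
  filterLevel: "low" | "medium" | "high";   // overall filtering aggressiveness
  hideImages: boolean;                      // whether flagged images are hidden outright
}

// Example: a children's pack with aggressive defaults.
const childrensPack: PreferencePack = {
  name: "Children",
  blockedKeywords: ["violence", "gore", "explicit"],
  filterLevel: "high",
  hideImages: true,
};
```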

The main cost will be the development of the service. Additional costs can come from, for example, marketing, handling of legal issues and contracts. To cover costs, the ideal way would be to receive funding from institutions and organizations concerned with the mental well-being of people, in our case the social media users.

Therefore, we will first focus on getting the free service going and building a user base. Later on, we will analyze users' behaviour and wishes to create additional paid features that can bring in revenue.

References

[1] Statista - Social media user statistics by age. Retrieved 27.11.2016 from https://www.statista.com/statistics/274829/age-distribution-of-active-social-media-users-worldwide-by-platform/
[2] The Guardian - Ask.fm: Is there a way to make it safe. Retrieved 27.11.2016 from https://www.theguardian.com/society/2013/aug/06/askfm-way-to-make-it-safe
[3] Huffington Post - What Constant Exposure To Negative News Is Doing To Our Mental Health. Retrieved 27.11.2016 from http://www.huffingtonpost.com/2015/02/19/violent-media-anxiety_n_6671732.html
[4] You-app - Small steps for happier healthier you. Retrieved 27.11.2016 from https://you-app.com/
[5] Medical Daily - The Psychological Effect Of Bad News, And What You Can Do To Stay Positive. Retrieved 27.11.2016 from http://www.medicaldaily.com/psychological-effect-bad-news-and-what-you-can-do-stay-positive-298084

