Sunday, December 11, 2016

Final video and report


Negativity online - what is it and why is it a problem?

Nowadays, a large portion of the content posted and shared on social media presents negative news and ideas. The rants, arguments and insults that are common in comment sections only add to the negative ambience online. Positive content, such as good news headlines and constructive discussions, does appear from time to time, but it is shared on a minuscule scale compared to the negative content on social media.

Due to this situation, users of social media may feel overwhelmed with negativity. This continuous bombardment of negativity affects their lives, especially their mental well-being, and may cause many problems, including depression, anxiety, stress and even post-traumatic stress disorder (PTSD) [3, 5].

In addition, there is a public sentiment that social media is becoming increasingly negative. Public discussions have been sparked by some extreme cases. Content with a negative sentiment is often seen to attract attention and to propagate quickly through social media [6].



Every day, people on the Web are affected by negative content. As a result, their frustration, sadness and sometimes annoyance are reflected in public discussions on social networks, blogs and forums. This situation can be illustrated by the blog post “What are all these violent images doing to us?” by Jay Ulfelder, which sparked a wide discussion [7]. The following is an extract from his post:



Early this morning, I got up, made some coffee, sat down at my desk, and opened Twitter to read the news and pass some time before I had to leave for a conference. One of the first things I saw in my timeline was a still from a video of what was described in the tweet as an ISIS fighter executing a group of Syrian soldiers...
That experience led me to this pair of tweets:
However, such situations are typical not only of Twitter but of all social media platforms, including but not limited to Facebook, Instagram, Snapchat and WhatsApp. Therefore, everyone active on the internet and social media is affected by negative content to some extent. Negative content affects different people differently: while some can easily move on, others are strongly affected by what they see and read or by the behaviour they experience from others.

Our group and goals

We are a diverse, multicultural group of international students from Vietnam, India, Russia and Romania. This enables us to understand the problem from different viewpoints.
We, as a group, value constructive discussion, the right to criticize (but not to bully), and positivity online. We believe everyone should have the right to control the level of negativity around them. Consequently, everyone has the right to consume the type of content they want.

Our goal is to reduce negativity on social media and promote positive content. Examples of this ‘negativity’ include repetitive negative news, uncensored brutal images, graphic stories and other materials that create negative feelings for the viewer.

As we are dealing with negativity online, in itself something that is difficult to quantify, we feel we cannot provide any clear-cut numeric way to measure our success. However, we have found the following metrics appropriate to understand the impact of our service:

  • User feedback: do the users feel they can reduce the negativity in their online life by using our solution? We can allow them to easily send us feedback and periodically ask for their opinions.
  • Tracking the number of positivity/negativity-related hashtags
  • Tracking the number of users of our service; a growing user base indicates success

Background

Stakeholders of creating and promoting negative content


We have tried to describe the stakeholders of content posted on social media in the following image:


This is a logical grouping, based on the roles people take when it comes to creating and spreading content on social media. All the groups influence the content that is allowed or that becomes popular, and each group plays some role in making content popular. Some groups may also stand in opposition to each other. For example, content consumers can be split into two opposing groups, one that interacts constructively and another that often vandalizes. Moreover, the groups “Enablers” and “Policing/Admins” play a major role in how content spreads and have control over content consumers.

The grouping presents several advantages when considering our task, attempting to reduce negative content and behaviours. Firstly, it helps us figure out the details of the societal problem we are focusing on and ‘who’ is involved in perpetuating the problem. Secondly, it helps us understand roles in creating and distributing content and how the representatives of these roles interact with each other. Overall, we are looking to provide a win-win solution to every stakeholder and having a clear understanding of the actors will help us achieve our goal.

Current initiatives and solutions

Several social media platforms provide mechanisms to report obscene and inappropriate content. Such content gets removed only once people report it and it comes to the notice of the authorities. Until then, people remain exposed to it.

Some people on social media are trying to promote positive content in order to cater to those who voluntarily seek it. These services exist as pages and groups (on Facebook or similar websites) or as full-fledged websites that post only positive content and good news. Examples of such services include Brightside and The Better India.

There are also some services that try to create a completely new social network based on social psychology principles in order to build a positive social media environment. Such services find ways to instill positivity based on credible social psychology research. One such example is the YOU-app [4], a completely new social network that promotes positive actions and mindfulness.

Restrictions and limitations

We have considered the following restrictions and limitations for our solution:

Social media relies heavily on getting attention from users - There is a reason negative content spreads so rapidly on social media: it quickly catches people’s attention, and the current style of doing business online is all about attention. In other cases, instead of doing something in reality (e.g. donating food or money), some people resort to sharing negative content (e.g. brutal images of war or dead animals), considering it a way to show others that they are active and doing something to improve a situation [7]. It would be wrong to remove negative content entirely, because it is crucial to know about important happenings around the world. What we are trying to do is give people the option of having less negativity thrown at them, if they so choose.

Legal issues - As we are planning to influence the content the users will see, we must carefully consider all the legal ramifications.

Schedule - Since we do not have an opportunity to develop this service in this course, we are only going to come up with a concept of the service.

Creation of information bubbles - One issue that arises from blocking negative content is the creation of information bubbles for the user. Exposing oneself to only a certain type of content may give one a skewed view of the world. We are presently considering ways to avoid these information bubbles. For example, the user could be presented with the content once and then choose to hide it in the future, rather than all content on a topic being hidden preemptively. We are trying to tackle this problem by blurring the content instead of hiding it entirely, so that an interested user still has a chance to see it. Thereafter the user can choose not to see that content anymore.

Security and privacy - The Echo add-on is a middleware and has access to the content the user accesses. Therefore, we must gain the users’ trust in order for them to install our add-on. For this, we need to promise that we respect the user’s privacy and securely store any data entrusted to us. Furthermore, the add-on should have the functionality to be disabled on certain pages.

Solution concept and approach

Our solution

We have named our solution “The Happy Echo”. The idea is to filter negative content from the user’s display based on his/her preferences.


The Happy Echo is a service that would work with social media accounts, in browsers or in apps, to curate and display content according to the users’ wishes. The service would act as a middleware between the content and the user. When the user is browsing social media websites, the content would first go through our service to be analyzed and modified according to the user’s preferences before being displayed. The following image is a detailed representation of the service:


The most important features of the service are as follows (a sketch of the filtering flow is given after the list):

  • Filtering - Based on the user’s preferences, the service can hide or suppress negative content. The level of filtering will depend on the preferences set in the add-on.
  • Crowdsourcing - Each post that is seen by the user will have an option to be marked as negative. For example, if the post is a very brutal image, then the user will most likely rate the image as negative. This way, we will collect vital crowdsourced data on the content that circulates on social media. Then, based on the user’s settings and the crowdsourced data, the add-on will filter out the required content.
  • Settings - The most important aspect of any service is its flexibility to adapt to user preferences. The user should be able to configure the service properly. These settings could include the level of filtering, the level of information access available to the service, and so on.
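
To make the flow concrete, here is a minimal TypeScript sketch of how the add-on could combine a user's own flags with the crowdsourced data to decide what to do with a post. All names here (Post, UserSettings, decideAction) are our own illustrative assumptions, not a finished design:

```typescript
// Hypothetical shape of a post as seen by the add-on.
interface Post {
  id: string;
  negativeFlags: number; // users who flagged this post as negative
  totalViews: number;    // users who have seen this post
}

// Hypothetical shape of the preferences set in the add-on.
interface UserSettings {
  // Fraction of viewers that must flag a post before it is blurred,
  // derived from the filtering level chosen in the settings window.
  blurThreshold: number;    // e.g. 0.5, 0.7 or 0.85
  flaggedByMe: Set<string>; // posts this user has flagged themselves
}

type Action = "show" | "blur" | "hide";

// Decide what to do with a post, based on the user's own flags
// and the crowdsourced negativity ratio.
function decideAction(post: Post, settings: UserSettings): Action {
  if (settings.flaggedByMe.has(post.id)) {
    return "hide"; // a post the user has flagged is never shown again
  }
  const ratio =
    post.totalViews > 0 ? post.negativeFlags / post.totalViews : 0;
  return ratio >= settings.blurThreshold ? "blur" : "show";
}
```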


We have created wireframes to describe the minimum viable product. The following is a walkthrough of how the add-on works:

  1. The user installs the add-on and sets their preferences in the settings window.





We have considered three levels of filtering (a sketch of how the crowdsourced ratings could map onto these levels follows the list):
  • Extremely disturbing content - 85-100% of users that have seen it previously have chosen not to see it again
  • Very disturbing content - a large majority (70-85%) of users that have seen it before have chosen not to see it again
  • Disturbing content - over 50% of users have marked it as disturbing; however, that share is not very high, and there is a strong possibility the content is only mildly uncomfortable for many people
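
As a rough illustration, the crowdsourced “chose not to see it again” percentage could map onto these levels as follows; the band boundaries are the ones listed above, while the function and type names are our own:

```typescript
type DisturbingLevel =
  | "extremely disturbing"
  | "very disturbing"
  | "disturbing"
  | "not disturbing";

// Map the share of previous viewers who chose not to see a post again
// onto the three filtering levels described above.
function classify(ratio: number): DisturbingLevel {
  if (ratio >= 0.85) return "extremely disturbing"; // 85-100%
  if (ratio >= 0.7) return "very disturbing";       // 70-85%
  if (ratio > 0.5) return "disturbing";             // over 50%
  return "not disturbing";
}
```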


  2. Each post the user sees has an option to flag its content as negative. This flag is sent to our server and merged with the crowdsourced data for the post to calculate what percentage of those who have seen the post have marked it as negative. Once a post is flagged, the user will not see it again.
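
A sketch of what flagging could look like on the client, assuming a hypothetical REST endpoint on our server (the URL and payload are illustrative only):

```typescript
// Posts this user has flagged; kept locally so they are never shown again.
const flaggedLocally = new Set<string>();

// Report a flag to the (hypothetical) Happy Echo backend.
async function flagPost(postId: string): Promise<void> {
  await fetch("https://api.happyecho.example/flags", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ postId }),
  });
  flaggedLocally.add(postId);
}

// On the server, the crowdsourced percentage is conceptually just
// the number of flags divided by the number of views.
const negativityRatio = (flags: number, views: number): number =>
  views > 0 ? flags / views : 0;
```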



  3. Based on the user’s settings, their own ratings, and the crowdsourced data, negative content is blurred on the display. If the user wants to see the filtered negative content, they need to click on the blurred content.
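
Blurring itself can happen entirely on the client side with a CSS filter; a minimal sketch, assuming the add-on can locate a post's DOM element:

```typescript
// Blur a post's element and reveal it only on an explicit click.
function blurPost(el: HTMLElement): void {
  el.style.filter = "blur(12px)";
  el.style.cursor = "pointer";
  el.title = "Filtered by The Happy Echo - click to view";
  el.addEventListener(
    "click",
    () => {
      el.style.filter = "none"; // reveal on explicit user action
      el.title = "";
    },
    { once: true } // afterwards the post behaves like any other
  );
}
```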


Who decides what is negative content?

It is important to tell the difference between what counts as negative content and what should be considered positive content. For that, we rely on crowdsourced ratings of posts on social media by users who have our add-on installed. If a user rates a post as negative, it will not be shown to them in the future, and the rating is sent to our servers and added to the pool of crowdsourced ratings. Thus we rely on the community to decide what is positive and what is negative by rating and flagging negative content.

As a result, we believe that users will be exposed to less negative content and have easier access to positive content. In turn, this will mean less stress and healthier content consumers. In addition, the solution would work with existing social media accounts and would be easy to integrate and use. Most importantly, the service could help deliver content based on one’s clearly defined preferences.

What makes us different?

We are trying to fix negativity in existing social media channels instead of creating a new social network. It is difficult for users to switch over to a completely new social network. More importantly, users themselves are usually unwilling to change after having grown accustomed to a dominant social media platform for a long time. Therefore, we think it is best to improve the existing setup. Apart from that, we are also trying to give users full control over the content they consume. Users will have full control, through their preference settings, over what content is filtered and what is displayed.

Initial target group

Our current target group is students. Younger people are more active on the Internet and use several social media platforms frequently. Statistics say that around 25% of Facebook’s users are in the 18-24 age group, the usual student age, and other services like Instagram and Snapchat likely have similar statistics [1]. We have selected students because they are more open-minded and open to change, and they often try out new technologies, which would help the early adoption of our concept.

Even though we have narrowed our target group to students, it is still a huge segment. As a preliminary target market (for the first release), we will choose Aalto University students. This makes it easier to get started, since students from the same university are more likely to try out the solution and give feedback for improvement.

Ideally, the solution could be used by any social media user in need of it, regardless of age, gender, social category etc.

Initial tasks and business model

The minimum viable product is a browser extension that acts as a middleware between the users and the content they consume. Later this would be expanded to other technology platforms, such as mobile and smart devices. The functionality of the minimum viable product is as described earlier through the wireframes.

As our goal is to solve a problem that exists in society, we believe at least the minimum functionality should be provided free of charge to end users. Depending on user needs and the response to our solution, we can later introduce more advanced features for a small fee. One such advanced feature could be custom-built preference packs that control the type of content to be filtered - for example, a teenager’s preference pack that filters content inappropriate for teenagers (a sketch of such a pack follows).
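
For illustration, a preference pack could simply be a named, pre-built bundle of settings; the shape and values below are hypothetical:

```typescript
// A preference pack bundles filter settings under a single name.
interface PreferencePack {
  name: string;
  blurThreshold: number;     // crowdsourced-negativity threshold
  blockedKeywords: string[]; // topics filtered outright
}

const teenagerPack: PreferencePack = {
  name: "Teenager",
  blurThreshold: 0.5, // blur anything half of its viewers flagged
  blockedKeywords: ["gore", "graphic violence"],
};
```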

The main cost will be the development of the service and hosting the back end on a web server. Additional costs can come from, for example, marketing, handling of legal issues and contracts. To cover costs, the ideal way would be to receive funding from institutions and organizations concerned with the mental well-being of people, in our case the social media users. Otherwise, donations and crowdfunding campaigns could bring in the required funds.

We will first focus on getting the free service running and gathering a user base. Later on, we will analyze users’ behaviour and wishes to create additional paid features that can bring revenue.

References

[1]
Statista - Social media user statistics by age. Retrieved 27.11.2016 from https://www.statista.com/statistics/274829/age-distribution-of-active-social-media-users-worldwide-by-platform/
[2]
The Guardian - Ask.fm: Is there a way to make it safe. Retrieved 27.11.2016 from https://www.theguardian.com/society/2013/aug/06/askfm-way-to-make-it-safe
[3]
Huffington Post - What Constant Exposure To Negative News Is Doing To Our Mental Health. Retrieved 27.11.2016 from http://www.huffingtonpost.com/2015/02/19/violent-media-anxiety_n_6671732.html
[4]
You-app - Small steps for happier healthier you. Retrieved 27.11.2016 from https://you-app.com/
[5]
Medical Daily - The Psychological Effect Of Bad News, And What You Can Do To Stay Positive. Retrieved 27.11.2016 from http://www.medicaldaily.com/psychological-effect-bad-news-and-what-you-can-do-stay-positive-298084
[6]
Stieglitz, Stefan, and Linh Dang-Xuan. "Impact and Diffusion of Sentiment in Public Communication on Facebook." ECIS. 2012.
[7]
Jay Ulfelder (2014). What are all these violent images doing to us? Retrieved 11.12.2016 from https://dartthrowingchimp.wordpress.com/2014/08/29/what-are-all-these-violent-images-doing-to-us/


Sunday, November 27, 2016

Draft Report


Negativity online - what is it and why is it a problem?
Every day we wake up to at least a couple of pieces of bad news that show up first on Facebook or on news websites. Checking the comments and discussions on social media, we see a lot of people hurling abuse and insulting each other. The result? A demoralized person.

Nowadays, a large portion of the content posted and shared on social media presents negative news and ideas. The rants, arguments and insults that are common in comment sections only add to the negative ambience online. This continuous bombardment of negativity affects the users of social media, especially their mental well-being. Positive content, such as good news headlines and constructive discussions, does appear from time to time, but it is shared on a minuscule scale compared to the negative content on social media. Many people nowadays feel overwhelmed with negativity. This may cause many problems, including depression, anxiety, stress and even post-traumatic stress disorder (PTSD) [3, 5], as negativity strongly affects a person’s mental well-being and decreases the quality of one’s life.

There is a public sentiment that social media is becoming increasingly negative. Moreover, rude behaviour and online fights are commonplace. Public discussions have been sparked by some extreme cases.

For example, several online bullying cases have surfaced on Ask.fm, a question-and-answer social networking website. Several people have been victims of anonymous threats of violence, rape and death, sometimes leading to tragedy. Below is the example of a victim of cyberbullying who committed suicide.

Hannah Smith, aged 14, was found hanged in her bedroom in Lutterworth, Leicestershire, by her sister Jo, 16. Her father, Dave, said she had been savagely bullied on the question-and-answer website Ask.fm.
"I have just seen the abuse my daughter got from people on Ask.fm and the fact that these people can be anonymous is wrong," Smith, 45, wrote on his Facebook page.
(From: The Guardian - Ask.fm: is there a way to make it safe [2])

Cyberbullying and other negative behaviour exist not only on Ask.fm but on all social media platforms, including but not limited to Facebook, Twitter, Instagram, Snapchat and WhatsApp.

Everyone active on the internet and social media is affected by negative content to some extent. At some point, everyone comes across rude behaviour, cyberbullying and bad news. Negative content affects different people differently: while some can easily move on, others are strongly affected by what they see and read or by the behaviour they experience from others.

Our group and goals

We are a diverse, multicultural group of international students from Vietnam, India, Russia and Romania. This enables us to understand the problem from different viewpoints.
We, as a group, value constructive discussion, the right to criticize (but not to bully), and positivity online. We believe everyone should have the right to control the level of negativity around them. Consequently, everyone has the right to consume the type of content they want.

Our goal is to reduce negativity on social media and promote positive content. Examples of this ‘negativity’ include hate speech, rude comments, repetitive and useless bad news, uncensored brutal images, etc.

As we are dealing with negativity online, in itself something that is difficult to quantify, we feel we cannot provide any clear-cut numeric way to measure our success. However, we have found the following metrics appropriate to understand the impact of our service:

  • User feedback: do the users feel they can reduce the negativity in their online life?
  • Tracking the number of positivity/negativity-related hashtags
  • Tracking the number of users of our service; a growing user base indicates success

Background

Stakeholders of creating and promoting negative content


We have tried to describe the stakeholders of content posted on social media in the following image:


This is a logical grouping, based on the roles people take when it comes to creating and spreading content on social media. All the groups influence the content and behaviours that are allowed or that become popular, and each group plays some role in making content popular. Some groups may also stand in opposition to each other. For example, content consumers can be split into two opposing groups, one that interacts constructively and another that often vandalizes. Moreover, the groups “Enablers” and “Policing/Admins” play a major role in how content spreads and have control over content consumers.

The grouping presents several advantages when considering our task, attempting to reduce negative content and behaviours. Firstly, it helps us figure out the details of the societal problem we are focusing on and ‘who’ is involved in perpetuating the problem. Secondly, it helps us understand roles in creating and distributing content and how the representatives of these roles interact with each other. Overall, we are looking to provide a win-win solution to every stakeholder and having a clear understanding of the actors will help us achieve our goal.

Current initiatives and solutions


Several social media platforms provide mechanisms to report activities such as cyberbullying and obscene content. Such content gets removed only once people report it and it comes to the notice of the authorities. Until then, people remain exposed to it.

Some people on social media are trying to promote positive content in order to cater to those who voluntarily seek it. These services exist as pages and groups (on Facebook or similar websites) or as full-fledged websites that post only positive content and good news. Examples of such services include Brightside and The Better India.

There are also some services that try to create a completely new social network based on social psychology principles in order to build a positive social media environment. Such services find ways to instill positivity based on credible social psychology research. One such example is the YOU-app [4], a completely new social network that promotes positive actions and mindfulness.

Restrictions and limitations


We have considered the following restrictions and limitations for our solution:

Style of current business - There is a reason negative content spreads so rapidly on social media: it quickly catches people’s attention, and the current style of doing business online is all about attention. Unfortunately, we cannot remove negative content, nor can we stop negative behaviours. What we are trying to do is give people the option of having less negativity thrown at them, if they so choose.

Legal issues - As we are planning to influence the content the users will see, we must carefully consider all the legal ramifications.

Schedule - Since we do not have an opportunity to develop this service in this course, we are only going to come up with a concept of the service.

Creation of information bubbles - One issue that arises from blocking negative content is the creation of information bubbles for the user. Exposing oneself to only a certain type of content may give one a skewed view of the world. We are presently considering ways to avoid these information bubbles. For example, the user could be presented with the content once and then choose to hide it in the future, rather than all content on a topic being hidden preemptively.

Solution concept and approach

Our solution


We have named our solution “The Happy Echo”. The idea is to filter negative content from the user’s display based on his/her preferences.


The Happy Echo is a service that would work with social media accounts, in browsers or in apps, to curate and display content according to the users’ wishes. The service would act as a middleware between the content and the user. When the user is browsing social media websites, the content would first go through our service to be analyzed and modified according to the user’s preferences before being displayed. The following image is a detailed representation of the service:


The most important features of the service are as follows (a keyword-filtering sketch is given after the list):

  • Filtering - Based on the user’s preferences, the service can hide or suppress negative content. The filtering can be based on, for example, user-generated keywords or machine learning algorithms.
  • Re-sorting and highlighting - The content on a page can be re-sorted or highlighted on the page so the user can view the positive content more easily than the negative content.
  • Tracking - The service can be configured to track user behaviour and actions and then make suggestions based on them. For example, if the user is being shown too much negative content, the service should be able to detect that and suggest similar but positive content.
  • Settings - The most important aspect of any service is its flexibility to adapt to user preferences. The user should be able to configure the service properly. These settings could include the level of filtering, keyword lists, re-sorting and highlighting behaviour, the level of information access available to the service, and so on.
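
As a minimal sketch of the keyword-based variant of filtering (the names and sample data are illustrative):

```typescript
// True if a post's text matches any of the user-generated keywords,
// in which case the add-on would hide or de-prioritize the post.
function matchesKeywords(text: string, keywords: string[]): boolean {
  const lower = text.toLowerCase();
  return keywords.some((kw) => lower.includes(kw.toLowerCase()));
}

// Example: filtering a feed against a user's keyword list.
const feed = [
  "Puppy adopted by local firefighters",
  "Graphic war footage emerges",
];
const visible = feed.filter((post) => !matchesKeywords(post, ["war", "graphic"]));
// visible === ["Puppy adopted by local firefighters"]
```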

With the rise in the efficiency of machine learning and artificial intelligence algorithms nowadays, it is imperative to make use of them. The service should be able to tell the difference between what counts as negative content and what should be considered positive content. To that end, there should also be a provision for crowdsourced training of the service, allowing users to flag negative content. Based on intelligent algorithms and user flagging, the service could become much more efficient at detecting negative content.

As a result, users will be exposed to less negative content and have easier access to positive content. In turn, this will mean less stress and healthier content consumers. In addition, the solution would work with existing social media accounts and would be easy to integrate and use. Most importantly, the service could help deliver content based on one’s clearly defined preferences.

What makes us different?


We are trying to fix negativity in existing social media channels instead of creating a new social network. It is difficult for users to switch over to a completely new social network. More importantly, users themselves are usually unwilling to change after having grown accustomed to a dominant social media platform for a long time. Therefore, we think it is best to improve the existing setup. Apart from that, we are also trying to give users full control over the content they consume. Users will have full control, through their preference settings, over what content is filtered and what is displayed.

Initial target group


Our current target group is students. Younger people are more active on the Internet and use several social media platforms frequently. Statistics say that around 25% of Facebook’s users are in the 18-24 age group, the usual student age, and other services like Instagram and Snapchat likely have similar statistics [1]. We have selected students because they are more open-minded and open to change, and they often try out new technologies, which would help the early adoption of our concept.

Even though we have narrowed our target group to students, it is still a huge segment. As a preliminary target market (for the first release), we will choose Aalto University students. This makes it easier to get started, since students from the same university are more likely to try out the solution and give feedback for improvement.

Ideally, the solution could be used by any social media user in need of it, regardless of age, gender, social category etc.

Minimum viable product and initial business model


The minimum viable product is a browser extension that acts as a middleware between the users and the content they consume. Later this would be expanded to other technology platforms, such as mobile and smart devices.

As our goal is to solve a problem that exists in society, we believe at least the minimum functionality should be provided free of charge to end users. Depending on user needs and the response to our solution, we can later introduce more advanced features for a small fee. One such advanced feature could be custom-built preference packs that control the type of content to be filtered - for example, a children’s preference pack that filters content inappropriate for children.

The main cost will be the development of the service. Additional costs can come from, for example, marketing, handling of legal issues and contracts. To cover costs, the ideal way would be to receive funding from institutions and organizations concerned with the mental well-being of people, in our case the social media users.

Therefore, we will first focus on getting the free service running and gathering a user base. Later on, we will analyze users’ behaviour and wishes to create additional paid features that can bring revenue.

References

[1]
Statista - Social media user statistics by age. Retrieved 27.11.2016 from https://www.statista.com/statistics/274829/age-distribution-of-active-social-media-users-worldwide-by-platform/
[2]
The Guardian - Ask.fm: Is there a way to make it safe. Retrieved 27.11.2016 from https://www.theguardian.com/society/2013/aug/06/askfm-way-to-make-it-safe
[3]
Huffington Post - What Constant Exposure To Negative News Is Doing To Our Mental Health. Retrieved 27.11.2016 from http://www.huffingtonpost.com/2015/02/19/violent-media-anxiety_n_6671732.html
[4]
You-app - Small steps for happier healthier you. Retrieved 27.11.2016 from https://you-app.com/
[5]
Medical Daily - The Psychological Effect Of Bad News, And What You Can Do To Stay Positive. Retrieved 27.11.2016 from http://www.medicaldaily.com/psychological-effect-bad-news-and-what-you-can-do-stay-positive-298084