Expert group ready to tighten social media regulations: It is time to regain control

Social media have had free rein in Denmark for a long time, with far-reaching consequences. It is time to tighten regulations, says the government’s external expert group, headed by a CBS professor.

06/09/2023

Mikkel Flyverbom

If you are not paying for the product, you are the product.

This is a well-known phrase about social media, and it has become common knowledge that the business models of the tech giants behind social media are built on making money from our data.

But we should be allowed to say no to that now, says Mikkel Flyverbom, Professor at the Department of Management, Society and Communication at CBS and chairman of the government’s external expert group for tech giants, which was established last summer.

»We must be allowed to say no to data harvesting. We should not have to agree to give up our data just to be able to use an app. We have to be able to actively opt in or out without being excluded,« Mikkel Flyverbom explains.

With the expert group, he has now presented four general recommendations intended to enhance democratic control of the tech giants and reduce the negative consequences of using their social platforms.

Four new recommendations

The new recommendations for the government are all based on EU legislation that has already been adopted, and they fall into four general categories:

  1. Data harvesting must be limited and made optional
  2. User retention mechanisms must be removed or limited
  3. Personal information must not be used for predicting behaviour
  4. Children and young people's use of social media must be regulated further

In this article, Mikkel Flyverbom goes over the specific recommendations and explains why they are necessary.

Data harvesting must be limited and made optional

The rules currently stipulate that you must be 13 years old for a tech company to harvest your data via an app. Mikkel Flyverbom and the expert group wish to raise this to 16 years.

»There is no reason for them to have access to the data of children. We want parents to consent to data harvesting until the users have turned 16,« says Mikkel Flyverbom.

Moreover, users should have the option to say no to data harvesting. Many apps today require us to agree to give up our data before we can install them, but in future, we must be able to say no to data harvesting and still gain access to the app, the expert group recommends.

User retention mechanisms must be removed or limited

The so-called user retention mechanisms must be limited. These include notifications, autoplay, where videos start automatically, and streaks, where the user is expected to interact with the app every day.

»We would prefer the option to deactivate them completely. They are designed to keep us active and capture our attention, and they are only there to serve the tech companies’ own business models. At the same time, they have other negative consequences: we spend more time on social media than we want to and are presented with more conflict-ridden material,« Mikkel Flyverbom says.

He suggests introducing a neutrality button that stops content from being served by algorithms, which may tend to promote violent, conflict-ridden content, and instead shows content in the order it was uploaded.

»Clearly, the experience will be much duller, but on the other hand it will help limit the time we spend on social media. We should also consider limiting the time children under a certain age are allowed to spend on social media. In China, children are only allowed on social media for one hour per day. I am not sure we need a limit that strict, but we can certainly consider it,« says Mikkel Flyverbom.

Personal information must not be used for predicting behaviour

When you log on to social media today, you allow the companies to access your personal information, which they use to target the ads you see at you specifically.

However, the data we give up is also used to predict the content we are inclined to interact with, which may mean that the news we are presented with does not reflect the overall news picture.

»At worst, it may mean that a person interested in being fit is served content promoting anorexia and eating disorders. Or a suicidal person may be confronted with content about suicide,« explains Mikkel Flyverbom.

For this reason, the expert group recommends that the algorithms used by social media must be labelled.

»Right now, the algorithms used by social media are completely impenetrable, with no access for researchers, supervisory authorities or the media. We recommend that companies must, as a minimum, be able to document that the algorithms do not have a negative influence on our physical and mental health,« says Mikkel Flyverbom.

Children and young people's use of social media must be regulated further

The expert group finds that platforms with content unsuitable for minors must have age restrictions.

»We know that children as young as 7 are on social media and other platforms, even though these platforms show violence, porn, death and other things that children usually cannot access. Just as we do not allow a 7-year-old to buy a bottle of whisky, we should not allow them to log on to their phone and watch a man having his head cut off. That is why 7-year-olds should not be on these platforms,« adds Mikkel Flyverbom.

A tool that would make it possible to prevent everyone under 13 from logging on to digital services is the European E-ID, which is currently being developed. It is similar to the Danish MitID and can confirm the user’s age and whether they should be granted access, without the user having to share private information.

»Clearly, it will still be possible to bypass, but it will require a few extra steps. And just as a 7-year-old can still get hold of a bottle of whisky, we have to send the message that we do not want to allow this,« says Mikkel Flyverbom.

Tech giants will fight this to the hilt

The expert group’s recommendations make numerous demands of the tech giants that run directly counter to their business models, and much of the responsibility that would be assigned to them is not something they have had to worry about until now.

Is there a concern that the tech companies will simply refuse to meet the demands?

»There is, but the recommendations are based on EU legislation that has already been adopted or is on the way. The problem is that we need enforcement, and therefore we need to invest the resources required to stand up to the tech giants. They have armies of lawyers who excel at delaying and overturning fines, and they will fight this to the hilt. But I am optimistic,« says Flyverbom and continues:

»Technically, this is possible. We can demand this of the tech companies: If you want access to Danish and European consumers, you have to make this work.«

Digitalisation optimism has prevailed in Denmark

Social media broke through in earnest in the 2000s and have had largely unregulated access to the data of the Danes since then.

Why is this happening now?

»Denmark is not a hardliner in this area. Generally, we have exuded a naive digitalisation optimism and wanted to be digital frontrunners, which means that we have not had the necessary discussions about the consequences until now,« says Mikkel Flyverbom.

Looking around Europe, Germany is more careful with data protection, an approach that goes all the way back to the Second World War. France has a special focus on protecting its citizens, their cultural heritage and the French language, so it has put up different barriers. Denmark has been more lost in digitalisation, says Mikkel Flyverbom, who hopes that Denmark can lead the way in new ways.

»Denmark is a pioneer when it comes to the environment and the climate, and we can be a digitalisation pioneer too. But we have a problem with dissatisfaction with life and the lack of social relations, so we have to make an effort here.«

»We must protect our children and our democracy«

The four recommendations for the government aim to limit the negative impact that social media and their business models have on Danes and the Danish society.

Which consequences?

»We do not want our children’s wellbeing to suffer. We spend a lot of energy on sending them to school, getting them to eat properly and getting them to exercise. Digital technologies should not conflict with this,« says Mikkel Flyverbom.

However, the problem of algorithms controlling what we see and gluing us to the screen goes beyond the declining mental health of children and young people. He elaborates:

»We have to be careful that the welfare society we have invested in is not undermined from the outside. Basically, it is about looking after our democracy and our social cohesion.«

Mikkel Flyverbom compares digitalisation with the industrialisation that turned society upside down in the 19th century.

»To a large extent, industrialisation was a change for the better. However, part of its effect was that children did hard physical labour for hours every day instead of going to school or playing, and factories poured chemicals directly into lakes and harmed nature. That part of the development was not necessary, and we worked through democratic institutions to stop child labour and reduce the damage to nature. In the same way, digitalisation must be democratically controlled so that we do not lose our grip on democracy and welfare.«

»It is important to keep in mind that the tech giants will not take care of these issues themselves. It is not the tobacco manufacturers who protect the consumers. It is not the fossil fuel manufacturers who look out for the climate. Other actors need to engage in these areas,« Mikkel Flyverbom concludes.

The government’s external expert group, which is to help create the framework for the tech giants, consists of:

  • Mikkel Flyverbom (chair), Copenhagen Business School
  • Lars Thinggaard, Tech for Life
  • Lone Sunesen, TV MIDT/VEST
  • Mie Oehlenschläger, Tech & Childhood
  • Miriam Michaelsen, Media Council for Children and Young People
  • Pernille Tranberg, DataEthics
  • Rebecca Adler-Nissen, University of Copenhagen
  • Rikke Frank Jørgensen, The Danish Institute for Human Rights
  • Sune Lehmann, Technical University of Denmark
  • Thomas Bolander, Technical University of Denmark
  • Peter Svarre, Digital Strategist, Speaker and Writer