The importance of building a compliant social business from the ground up

By Michael Collett, CEO and Co-founder of Kin

Social media has become an increasingly integral part of our everyday lives and an indispensable form of communication. An estimated 4.2 billion people now use social media platforms worldwide, and around 87% of UK teens aged 12-15 are thought to have a social media account.

Whilst social media platforms provide a space for creativity, activism, and keeping us connected to friends and family, they also have a dark side. Issues such as cyberbullying, addictive use, fake news, trolling and anxiety are just some of the problems major social media platforms are accused of either fostering or not doing enough to prevent. In response to these online risks, conversations around regulation and social media compliance are increasingly gaining traction. Although social media platforms have largely been left to self-regulate, one study found that 74% of people in the UK did not trust social media companies alone to decide what counts as extreme content or disinformation when it appears on their platforms.

With children and teens particularly vulnerable to the harms of social media, the UK regulator, the Information Commissioner’s Office (ICO), has introduced new regulation, the Children’s Code, to protect children online. Introduced in September 2021, the code restricts companies from tracking children’s locations, personalising content or advertising directly to them, and serving up behavioural nudges such as automatically playing videos. Companies that fail to comply with these rules, and to demonstrate that their apps and platforms do so, risk fines of as much as £17.5 million.

Young people deserve a safe space online where they can interact with family and friends without the risks associated with social media platforms. The new regulation sets out to make these spaces, namely apps, online games, websites and social media services, safer and more reliable for children. However, creating a compliant social media business can be easier said than done.

How major networks have had to adapt 

Major social media platforms have previously come under fire for their policies, or lack thereof, regarding children on their platforms. One example is TikTok. The European Consumer Organisation (BEUC) has complained that TikTok allows advertisers to use hidden advertising on the platform, such as branded hashtag challenges that encourage users to create content. The organisation also says that the app serves up “inappropriate content such as videos showing suggestive content which are just a few scrolls away” to its non-adult users.

Facebook has also been criticised in recently published research for allowing advertisers to target young users with age-inappropriate ads promoting smoking, vaping, alcohol, gambling and extreme weight loss.

Major social media platforms are making changes, though. In response to the Children’s Code, some have had to alter the way their businesses operate. For example, TikTok has turned off notifications for children past bedtime, while Instagram has disabled targeted adverts for under-18s entirely and now defaults teens to private accounts. YouTube has also announced plans to turn off autoplay for teen users and to “gradually” adjust the default upload setting to the most private option for users aged 13 to 17, limiting the visibility of videos to the uploader and those they share with directly, not the wider public.

However, an interesting yet worrying example of the complexities of social media regulation can be found in Facebook’s recent changes to Instagram. Instead of targeting specific users based on interests gleaned from data collection, advertisers can now only reach young people broadly, by age, gender and location. Yet Facebook has admitted there is “no foolproof way to stop people from misrepresenting their age” when joining Instagram or Facebook. The apps ask for a date of birth during sign-up, but have no way of verifying the responses.

Moreover, some platforms have failed to introduce any policies at all. Twitter, for example, has previously declined to answer questions about how it is complying with the new code. Nevertheless, the ICO’s executive director, Stephen Bonner, has praised the “significant changes” made by major platforms, crediting the new compliance rules for driving them.

How new businesses can learn from these past mistakes

Compliance can be complicated and varies between industries. Put simply, however, social media compliance means following the rules when engaging with, or facilitating, public interactions on a platform. Despite some ‘self-governing’ platforms working to remove harmful content, many governments around the world, including the USA, Germany, Brazil, Bahrain, Hong Kong, India and Switzerland, have seen fit to implement some form of regulatory action to address the issue at a policy and judicial level.

While all businesses should monitor compliance with their own corporate policies and standards, they must also follow government regulations on social media and digital marketing. Whilst it can be argued that the industry’s international nature calls for a global approach to social media regulation, every platform should know the rules and regulations of the countries in which it operates; being ‘international in nature’ does not excuse a company from in-market compliance. In the UK, for example, in addition to the new Children’s Code, platforms need to comply with the draft Online Safety Bill, published in May 2021, which imposes a “duty of care” on social media companies, and some other platforms that allow users to share and post material, to remove “harmful content”.

It is also important that new social media platforms learn from the past mistakes of the social media giants. Targeted advertising is perhaps the most talked-about issue with social media platforms, and a particular area of concern. Customer data is big business, and it is common practice for companies to collect quantitative and qualitative consumer data daily, whether to understand user behaviour or to sell to third parties for targeted ads. Features like this need to be curtailed through comprehensive regulation that either prohibits them or requires companies to give users the choice of opting out of targeted advertising. At the same time, some businesses are switching models to keep their platforms compliant while protecting users: private, ad-free social media networks are increasingly becoming the norm, and by running on subscription-based models they avoid selling user data for marketing purposes, keeping personal information truly safe and secure.

A variety of software tools also exist to help with managing compliance. Platforms such as ZeroFOX and Hootsuite can help keep brands compliant in several ways: ZeroFOX monitors accounts and sends alerts if it identifies non-compliant content, while Hootsuite, among other things, lets you set custom access permissions for employees creating social content.

Social media compliance is now a balancing act between businesses’ own self-governing policies and government regulation. Adhering to existing regulations is extremely important, but it is also the bare minimum. To truly enact change and create a safer space for users, social media platforms must implement internal policies that genuinely protect the interests of people, not just profit. Above all, they must hold themselves accountable when mistakes are made and work to find a solution; failing to do so can have long-lasting impacts on a business, as users and employees increasingly hold companies to account for their actions. Facebook, for example, has recently come under fire after a whistleblower told the US Congress that the company continually pursued profit over the welfare of its users and the wider public.

Social media companies must now decide whether to continue operating without limitations and risk increased government intervention, or to implement more aggressive self-governing policies and moderation to keep their users safe. They can no longer keep their heads in the sand; they must go beyond regulation to ensure they are acting ethically and safely.