14 Apr 2022
The Bill
After years of debate around how to police the internet, the proposed Online Safety Bill aims to introduce new laws to protect users in the UK from harmful online content. The Bill requires firms to work to minimise their end users' exposure to harmful content. This will have implications for firms whose sites host user-generated content – sites that allow UK users to communicate through messaging, comments or forums, eg Facebook, YouTube or Twitter – and for search engines, including Google or Bing.
These platforms will be required to remove illegal material, for example content relating to terrorism, child sexual exploitation or self-harm. The Bill also requires large social media platforms and search engines to maintain ‘proportionate systems’ to prevent fraudulent adverts being hosted on their services, in effect stopping scams before they reach financial institutions. With the increased use of these platforms, and a similar increase in the number of scams, it is important that firms act to protect their users – from the platforms frequently accessed by children to the big social media platforms where fraudulent adverts have seemingly become part of everyday life.
One retired teacher signed up to a Bitcoin investment opportunity after seeing an advertisement on Instagram apparently endorsed by Bear Grylls, a survival and outdoor adventure TV personality. The retired teacher contacted the broker, checked out the website and invested £120,000 in buying and trading Bitcoin. In fact, this money went directly to the fraudsters.
Whilst platforms already operate some processes, including advanced verification checks and user reporting of scam ads, they will need to demonstrate that their systems are ‘proportionate’ in putting adverts under greater scrutiny. Ofcom will oversee and enforce the new rules, and will determine what is ‘proportionate’.
The risk
Scammers post fake ads on these platforms with the aim of stealing money or personal information from users. This can be as simple as an ad on Facebook promoting products that do not exist, or a paid-for investment advertisement that tricks users into ‘investing’, by bank transfer, in something that does not exist. This is known as an authorised push payment (APP) scam or bank transfer scam: money is transferred directly from the victim to the bank account of a scammer. For example:
Mr A clicks on a legitimate-looking investment advert on his social media. The advert promises attractive returns. Mr A invests a small amount. Soon after, Mr A is contacted and told the investment has doubled. After this positive first experience, Mr A decides to invest increasing amounts of money. The ‘broker’ then contacts Mr A to advise that the investment has crashed and the funds have gone. Mr A never hears from the ‘broker’ again.
Social media platforms are becoming ever more powerful and integrated with our daily lives. Protecting users from these types of scams should be a priority for these platforms, and the proposed Bill will force their hand.
With UK advertising revenue from the internet increasing significantly, and forecast to continue rising according to Magna Global data on Bloomberg, the impact of this legislation will reach the masses.
There has been a notable shift away from newspaper advertising since the late noughties, with search and social media now the top internet advertising revenue generators – a trend Magna Global forecasts will continue. The overall intention of the Bill is to focus on these areas and prevent fraud before it hits financial institutions.
A survey from GWI found that users aged 16 to 64 used an average of 7.5 social media platforms worldwide, and 6.3 in the UK, with average daily usage of 2 hours and 27 minutes. This means the exposure from just one scam advert on one platform can be vast.
With the introduction of 3D ads boosting monetisation in gaming, the new legislation is also set to impact the gaming industry – the dominant gaming platform, Roblox, alone has reported 45.5m daily active users.
According to Bloomberg, Meta’s daily active users have grown to 1.97bn in 2022, with advertising revenue of $28.15bn at the end of Q2. Twitter’s users grew to 237.80m, with advertising revenue of $1.08bn.
Current protection for users/consumers
Whilst some protection already exists, it is not law. Introduced in 2019, the Banking Code is voluntary for banks and building societies. If a bank is signed up, it commits to taking steps to protect customers by:
- identification of high-risk payments;
- delaying/preventing payments where there are concerns;
- preventing fraudsters opening bank accounts; and
- reimbursing authorised push payment (APP) scam victims, particularly where a bank has fallen short of the proactive standards.
UK Finance recorded 195,996 incidents of APP scams in 2021 and cited in its 2021 fraud facts report that £188.3m had been reimbursed to customers since the Code was introduced in 2019. Where the bank refuses to reimburse the customer, who may have been the victim of this type of crime, the customer can typically complain to the Financial Ombudsman Service.
The Online Safety Bill started life as a Green Paper in 2017 and was published as a draft bill in May 2021, but has seen many delays in its progress to statute. The intention of the Bill is to place a legal duty on tech companies to tackle the harm caused by content hosted on their sites. In early September, the new Prime Minister announced that the Bill would proceed and would return to the House of Commons with some tweaks focusing on free speech.
Once introduced, this statute may be further supported by the Financial Services and Markets Bill, which is intended to strengthen the financial services industry’s protection for customers. That Bill was laid before Parliament in July 2022 and contains measures to reimburse scam victims and protect access to cash. These pieces of legislation, if successfully adopted as law, are set to disrupt the approach taken by fraudsters. The focus is on disruption before the banks are involved, and on encouraging banks to take proactive steps to educate their customers so that victims do not make payments in the first place, reducing the need for reimbursement.
Whilst the Online Safety Bill applies to immersive technologies, including the Metaverse, the Bill currently focuses on published user-generated content rather than activity. This would certainly capture scam ads enticing users, or their avatars, to invest their cryptocurrency in real estate within the Metaverse, for example. But given the real-time activity afforded by this technology, the Bill does not yet capture behaviours or virtual conversations publicising dubious real estate investment opportunities, for example. That is not to say fraudulent behaviour would not be covered by other legislation, but the Online Safety Bill may need further adaptation to meet the government’s aim of making the UK the safest place in the world to be online.
For further information please contact Erin Sims.