Social media is no longer a static set of apps and behaviours. It is a fast-moving mix of law, platform policy, technology (especially AI), creator economics and child-safety standards.
Courses and guidance written in early 2024 are already missing critical changes that directly affect schools, teachers, safeguarding leads, parents, creators and organisations.
At SMACC, the Small Social Media and Content Creators Network, we exist specifically to track these changes in real time and translate them into practical, current guidance.
1. Regulation Has Moved From Theory to Enforcement
🇬🇧 United Kingdom – Online Safety Act
For years, training referenced upcoming regulation. That changed decisively over the last two years.
The UK Online Safety Act is now moving into active enforcement, with Ofcom publishing phased codes of practice and guidance covering:
- Illegal harms
- Child safety duties
- Age-assurance and age-verification expectations
- Platform accountability and fines
This means advice that once said “platforms may be required to…” is now outdated. Teachers and organisations need to understand what platforms must now do, and where responsibility still sits with users and educators.
🔗 https://www.gov.uk/government/publications/online-safety-act-explainer
🔗 https://www.ofcom.org.uk/online-safety
2. Europe Has Started Taking Action Against Platforms
🇪🇺 Digital Services Act (DSA)
The EU Digital Services Act is no longer just a legal framework — it is being actively enforced.
In the last two years:
- The European Commission has opened formal investigations and issued preliminary findings against platforms including TikTok, Meta and X
- Platforms are being challenged on algorithm transparency, researcher access, and child protection measures
- Penalties are now real, not theoretical
This directly affects:
- What content is recommended
- How misinformation spreads
- How children and teens experience feeds
🔗 https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package
🔗 https://www.reuters.com/technology/
🔗 https://www.lemonde.fr/pixels/
3. TikTok Has Become a Live Legal and Political Risk
TikTok has faced more regulatory and political pressure in the last two years than almost any other platform.
Key developments include:
- US legislation requiring TikTok’s parent company to divest the platform or face a nationwide ban
- Court challenges and shifting deadlines
- Increased scrutiny in Europe around data, AI use and child safety
For creators, schools and organisations, TikTok is no longer just “another app”; it is a platform that must be actively risk-managed, and that requires up-to-date understanding.
🔗 https://www.reuters.com/world/us/
🔗 https://www.reuters.com/technology/tiktok/
4. Teen Safety Has Fundamentally Changed on Major Platforms
One of the biggest changes since early 2024 is that safety is now built into accounts by default, not left to optional settings.
Instagram Teen Accounts
Meta introduced Teen Accounts with:
- Private accounts by default
- Restrictions on who can message teens
- Content filtering and recommendation limits
- Parental approval before younger teens can loosen these settings
This is a major shift in how young people experience social media — and something many older training materials do not mention at all.
🔗 https://about.instagram.com/blog
🔗 https://www.meta.com/gb/news/
5. AI, Deepfakes and Synthetic Media Have Changed the Rules
AI has moved from novelty to everyday creator tool — and with it came new risks.
Major changes include:
- Mandatory AI / altered content disclosures on platforms like YouTube
- New enforcement around deepfakes and non-consensual imagery
- Increased concern about realism, misinformation and trust
Training that does not include AI literacy is now incomplete.
🔗 https://blog.youtube/news-and-events/
🔗 https://www.reuters.com/technology/artificial-intelligence/
6. What Gets Recommended Has Changed
Platforms now actively shape what users see — especially around:
- Political and civic content
- Sensitive topics
- Youth-targeted feeds
Meta, for example, changed how political content is recommended on Instagram and Threads, altering the information environment for young people.
Understanding feeds vs follows, recommendation systems, and algorithmic nudging is now essential.
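To make the distinction concrete, here is a deliberately simplified Python sketch contrasting the two feed types. It is a toy model, not any platform’s actual ranking system; the Post fields and the predicted_engagement score are invented purely for illustration.

```python
# Toy illustration only: contrasting a "follows" feed with a
# "recommended" feed. No real platform works exactly like this;
# every name and score below is made up.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float             # how long ago it was posted
    predicted_engagement: float  # hypothetical model score, 0..1

def follows_feed(posts, followed):
    """Chronological: only accounts the user follows, newest first."""
    mine = [p for p in posts if p.author in followed]
    return sorted(mine, key=lambda p: p.age_hours)

def recommended_feed(posts, followed):
    """Ranked: any account at all, ordered by a predicted-engagement
    score, with only a small boost for followed accounts."""
    def score(p):
        boost = 0.1 if p.author in followed else 0.0
        return p.predicted_engagement + boost
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("friend", age_hours=1, predicted_engagement=0.2),
    Post("stranger", age_hours=30, predicted_engagement=0.9),
]
followed = {"friend"}

# The follows feed shows only "friend"; the recommended feed puts
# "stranger" first because the model predicts more engagement.
print([p.author for p in follows_feed(posts, followed)])      # ['friend']
print([p.author for p in recommended_feed(posts, followed)])  # ['stranger', 'friend']
```

The point of the sketch is the nudge: in the recommended feed, what a young person chose to follow matters far less than what a scoring model predicts they will engage with.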
🔗 https://about.fb.com/news/
7. Creator Economics and Copyright Keep Shifting
In 2024 alone, creators saw:
- Major music catalogues removed from TikTok during licensing disputes
- Sudden changes to monetisation rules
- New restrictions on what audio and media can be used
Advice given one year ago about “what music is safe to use” may now be wrong.
🔗 https://www.reuters.com/technology/media/
Why This Matters for Education and Safeguarding
When guidance freezes in time, it risks:
- Teaching outdated safety advice
- Missing new platform defaults
- Underestimating legal responsibilities
- Leaving young people unprepared for how platforms actually work today
Social media education must be continuous, not static.
How SMACC Is Different
SMACC exists to:
- Track live platform changes
- Monitor UK, EU and global regulation
- Translate policy and tech changes into plain-English guidance
- Regularly update training for schools, teachers, creators and organisations
We do not publish once a year — we update as the landscape changes.
Final Thought
In social media, being slightly out of date is the same as being wrong.
Keeping up is no longer optional — it is essential.