Terms like "algorithm" and "recommender system" have entered the common lexicon in recent years, particularly in the area of online safety.
Campaigners say these systems, which push content into users' feeds, can be 'toxic' as they expose people to harmful material.
There have been calls for governments and regulators to force platforms to switch off algorithms, but this would be strongly resisted by social media companies.
The systems are a central part of their business models. They keep users on the platform for longer and allow tech firms to sell targeted advertising.
Social media companies argue that algorithms actually make their apps safer as they ensure a curated, more controlled user experience.
Media regulator Coimisiún na Meán has opened investigations into Meta relating to the recommender systems that promote content on Facebook and Instagram.
The investigations will assess if Meta has breached the EU Digital Services Act (DSA) amid concerns over the ability of users to easily change their feed preferences.
A recommender system feed based on profiling promotes posts, videos, products or articles it predicts the user will like, drawing on their previous online behaviour such as searches, clicks, likes and shares.
Under the DSA, users have a right to choose a recommender system feed which is not based on profiling and they must be able to do this in a direct and easily accessible way.
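The distinction can be illustrated with a toy sketch. This is purely hypothetical code, not Meta's actual system: the "profiled" feed ranks posts by how closely their topics match a user's past interactions, while the non-profiling alternative the DSA requires could be as simple as a reverse-chronological feed that uses no behavioural data at all.

```python
# Hypothetical illustration of profiling vs non-profiling feeds.
# All names and data structures here are invented for the example.

from collections import Counter

def profiled_feed(posts, interaction_history):
    """Rank posts by overlap with topics the user engaged with before."""
    interests = Counter(topic for _, topic in interaction_history)
    return sorted(posts, key=lambda p: interests[p["topic"]], reverse=True)

def chronological_feed(posts):
    """Non-profiling alternative: newest first, no behavioural data used."""
    return sorted(posts, key=lambda p: p["timestamp"], reverse=True)

posts = [
    {"id": 1, "topic": "sport",   "timestamp": 100},
    {"id": 2, "topic": "cooking", "timestamp": 200},
    {"id": 3, "topic": "sport",   "timestamp": 300},
]
history = [("click", "sport"), ("like", "sport"), ("share", "cooking")]

print([p["id"] for p in profiled_feed(posts, history)])  # sport posts first
print([p["id"] for p in chronological_feed(posts)])      # newest first
```

The point of contention in the investigations is not the ranking itself but whether switching between the two modes is as "direct and easily accessible" as the DSA demands.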
Coimisiún na Meán said it has concerns that there may be so-called "dark patterns", or manipulative and deceptive designs, which could prevent people from accessing recommender system options.
Meta announced a non-profiling option on Facebook and Instagram in 2023 in response to the DSA.
In a statement the company said it disagrees with any suggestion that it has breached the Digital Services Act.
Meta said it has introduced substantial changes to its processes and systems to meet its regulatory obligations, and that it will engage with Coimisiún na Meán to share details of this work.
The investigations announced today come amid a push from regulators in Europe and prosecutors in the US to focus on the design of social media apps rather than the content posted on them.
For years, tech companies have been shielded from prosecution over the material they host.
It is a different story when it comes to the building blocks of the systems they rely on.
In March, a California jury found that Meta and Google were liable for designing platforms that are addictive.
Both companies said they disagreed with the verdict and are appealing.
A day earlier, a jury in New Mexico found Meta liable for misleading users over the safety of its platforms for children.
The company was ordered to pay $375 million but Meta has vowed to appeal.
In February, the European Commission accused TikTok of creating an "addictive design" in its app which could harm the physical and mental wellbeing of minors and vulnerable adults.
TikTok rejected the findings, saying they presented a "categorically false and entirely meritless depiction" of its platform.
If Coimisiún na Meán finds that Meta is in breach of the DSA, the company could face a fine of up to 6% of its annual worldwide turnover.
Meta has deep pockets however, and critics of online regulation say fines are not an effective deterrent.
Many would rather see meaningful changes to the algorithms that have dominated so much of the online safety debate in recent years.