Prime Minister Kyriakos Mitsotakis participated in a conversation with Baroness Beeban Kidron, Member of the House of Lords of the United Kingdom, in the context of “The Lyceum Project 2025 – Children in the age of AI” conference, held at the Athens Conservatoire. The discussion was moderated by Dr. Konstantinos Karachalios.
The Prime Minister’s remarks follow:
In his introductory remarks, the Prime Minister stated:
Well, first of all, thank you so much for inviting me. My congratulations to the World Human Forum and to “Demokritos” for making the Lyceum Project a yearly meeting point to discuss extremely thought-provoking but also complicated issues that connect technology to the humanities, to philosophy.
It was our belief from the beginning, within our AI strategy, that Greece can play a thought leadership role in connecting ancestral intelligence, as Alexandra very rightly coined the term, with artificial intelligence, to understand what we can actually learn from ancient philosophy and wisdom in addressing what is a profound and, in my mind, unprecedented challenge. And I’m referring to the rise of AI in general.
For the first time we humans no longer have the monopoly on intelligence. This is a profound change from any previous technological revolution that has taken place. It questions the whole notion of humanity, of consciousness, of agency. In that sense, it is not just a technological evolution or even a revolution. It’s something, in my mind, completely different, qualitatively different.
Now, back to the topic that we’re discussing, I’m really privileged to share the stage with the Baroness. She has done an amazing job, both in terms of educating the public, but also coming up with real suggestions on how to address a real problem.
And what is the real problem? The real problem is exactly that in the digital world we no longer treat children as children. We treat them as adults, even though we recognise that they are more vulnerable than adults, in terms of patterns of behaviour that are well established by science.
I’ve made this point numerous times, that we’re running an unprecedented experiment, a global experiment, with the mental health of our children and our teenagers. As far as we know, from the scientific evidence that we have, this is not going to turn out well. We already have enough science to understand that something is going wrong with our children, with their mental health.
When we speak to parents about this, they recognise it immediately. I was pleasantly surprised but also shocked by the amount of support we got from parents when we started speaking about this topic, because parents need to be part of the equation. In a sense we understand something is happening. We don’t want our kids to be glued to a screen, but we don’t know how to deal with the problem.
So there’s no doubt that we have a problem, and there’s no doubt about its root cause: the content consumed by children and teenagers is designed to keep our kids and teenagers glued to a specific application for as long as the tech companies can manage, because they make money out of it. It’s as simple as that.
The models and the algorithms are designed to promote addiction. In the same way, and I apologise for being slightly provocative here, that tobacco companies knew many decades ago that smoking was addictive. They knew it, and yet they set out to prolong the habit for as long as they could. I think these are facts that we already know.
As responsible leaders, I feel that we have to do something about it. What is it that we have done in Greece? The first thing that we’ve done is to openly speak about the issue, engage with parents, and try to explain to them that already within the given technological world there are tools that they can use to limit the access that their kids have to their smartphones and to specific applications.
The second thing that we did was try to overcome what the tech companies always tell us is the big problem: that “we don’t know the real age of the user”. How do we get a proper age verification tool in order to define who is a digital adult and who isn’t? This is also an interesting discussion that we need to have.
Well, we said, okay, we can solve this through an application; we call it “Kids Wallet”. We will simply connect this application to our digital registry, and then it will be easy to verify the exact age of any user. I cannot believe that today we are still hiding behind the very flimsy excuse that “we don’t know the age of the users”, when technology has advanced so far. And “Kids Wallet” also gives parents, in a simplified way, more control over the applications themselves.
Again, it is impossible to do this without involving and educating the parents, because it becomes a “collective action” problem. Which parent is going to tell their kid “I don’t want you to spend more time on your phone” if everybody else is doing it? So, we need to address the problem in its entirety.
Then, of course, the next stage is how we turn this into more thoughtful or advanced European regulation. Because as a medium-sized country, there’s only so much we can do. And how far should we take this discussion? Should we take it as far as a complete ban on social media under a certain age? Is this something which is realistic? Is this a real possibility?
It’s interesting that China, which is so advanced in technology, has put its own restrictions on social media, and they’re quite draconian. So they understand that they don’t want their children engaged. Should we have time restrictions, for example, designed as an interim step: “you can only spend x amount of time on any given application per week”?
I fundamentally believe that raising children is about boundaries. And there are no boundaries now in the digital world. It’s sort of strange, going back to the argument made by Jonathan Haidt in his very, I think, important book, “The Anxious Generation”, that we don’t allow our kids to play or encourage them to be outside. We are scared of unsupervised play. We track our kids everywhere they go, through our nice tracking applications, because we want to feel safe. Because we think we protect them this way.
But we do almost nothing when it comes to the digital world: in terms of what content they consume, in terms of violence, in terms of extreme and aggressive pornography, in terms of little girls being exposed to AI-generated models of beauty which are completely unattainable. About all that we do almost nothing.
So, we really want to be leaders in this topic. I want to be a leader at the European level. When I speak to my European colleagues, I tell them we need to pick our regulatory battles in a smart way. We cannot regulate everything as Europe. But what is it that we should really be concerned about? We need to recognise that innovation and regulation don’t always go hand in hand. But let’s pick those battles that are important to us.
And for me, there are two battles. The first is about democracy and how technology is actually interfering with the democratic debate. And the second is the protection of our kids and our teenagers. These are the two battles I’ve decided to pick.
And as far as I know, I was the only leader who spoke about this at the UN General Assembly last year. And I intend to speak again about it, because I think it is a global problem.
I think we also need to tell the technology companies in no uncertain terms: look, you’re making enough money; you don’t need to make money from our children. You just don’t. I mean, there’s enough money to be made in general. But they know very well that they want to capture those customers -because for them they are customers- at a very early age and hook them on platforms out of which they will make money.
And, of course, we have not yet spoken at all -because right now the topic is children in the age of AI, and we are still talking about the digital platforms, which, of course, use a lot of AI- about what AI means for education.
There are great opportunities, but also great challenges. Because, at the end of the day, we need to understand that our brain has evolved over millions of years through our interaction with the physical world. So, there are skills which we acquire as we grow up that are necessary for us to become fully functioning adult humans.
We are now throwing at our kids a completely different sensory world. They live in two worlds: a physical and a digital one. We have no idea how this is going to turn out.
But how are you going to tell your kid to resist the temptation of using any large language model, any ChatGPT, for doing their homework?
It’s as simple as that. The temptation is too big. I mean, how are you going to resist that? I don’t have an answer, but I know that we are asking our kids to do something which they simply cannot do. We can’t resist it ourselves; we frequently don’t. So if we can’t resist it, if we recognise that we are addicted, how can we expect our kids to behave in a different way? They won’t.
Regarding the role of parents and, above all, technology companies in efforts to tackle internet addiction of minors, the Prime Minister noted:
I agree with you that, at the end of the day, we also cannot put too much emphasis on what parents can do, because parents will be competing, will be fighting with their children; but that is part of the growing-up process. I mean, parents have to set boundaries everywhere, not just in the digital world.
So when parents tell me “I don’t really want to do this”, I say: “sorry, but that’s part of your responsibility as a parent”. But you’re absolutely right that there’s only so much we can expect parents to do; still, we need them very actively engaged. At the very least, they need to know what tools already exist, which at the moment they don’t.
But I think the default settings are incredibly important because at the end of the day, what we want is to break the cycle of addiction. So, we want to break the scrolling pattern and this could happen through appropriate default settings.
Let me just give you one personal example, which I found very interesting. When you go into YouTube, for example, YouTube knows what you like. So you go in and it starts serving you the videos it knows you like, maybe you’re interested in cars or whatever. So I go in, and there the videos are.
Now, what happens if you turn off the history setting? You go into YouTube and there’s a blank page. Okay, I still need to type in what I want to see. I can tell you the usage is going to drop tremendously, because every single time you go into the application, it is not going to serve you the content that it knows will interest you. So the addictive pattern needs to be broken at the beginning, or it needs to be broken during the scrolling itself: “Stop. You’ve done it for a minute. You’ve seen three things. I block you. You’re done for the day. Get out. Come back tomorrow”.
But all this is premised upon the design of the product. There’s only so much you can do as a parent to put restrictions in place. So, I absolutely agree with you. We need to recognise where the problem is. We have, I think, two types of problems. The first is the addiction itself and the time we spend. And, of course, time is finite, and we need to ensure that we also offer our kids other opportunities.
So, for example, for me, the best thing we can do for our kids and our teenagers is sports, because inevitably, if you do sports, you won’t be on your phone. Building infrastructure and focusing on sports at a younger age is probably the best thing we can do. Then there is the content. Because not all technology is bad. How do we reward creative technological ideas? Doing, for example, a puzzle of a painting. How do we encourage that interaction with technology, which we know is useful, while essentially punishing what we know is detrimental?
I think you’re right. A lot of that will depend on the companies themselves designing their default settings in a way that will break this addiction. Of course, there will always be illegal content, problematic content, what you mentioned about pornography and about more violent behaviour.
It is also a question of moderation at the end of the day. The companies keep withdrawing from this domain, as if they don’t care, as if they’re just platforms that are not responsible for the content. That was the root of the US regulatory approach to them: you’re not curators, you are just platforms. And, of course, the extreme arguments behind freedom of speech are pushing us, unfortunately, in the opposite direction.