Automatic Transcription: it could contain errors.
Alessandro Oppo (00:00) Welcome to another episode of the Democracy Innovator podcast, and our guest today is Alex Blaga. Thank you for your time, Alex.
Alex Blaga (00:11) Thank you, Alessandro. It's a pleasure.
Alessandro Oppo (00:13) And you're working on Trollwall AI, right? Would you like to tell us something about this project?
Alex Blaga (00:19) Yes,
we're getting into it right away, I see. So the project that I'm working on at the moment is called Trollwall AI. As the name suggests, it's an artificial intelligence platform with an LLM behind it that we've built ourselves. In very short terms, and perhaps we can develop this later on, we focus on moderation and community management on social media,
with a focus on toxic, harmful comments. We identify the most toxic, most harmful comments in the comment sections on social media and we take care of them, so that we can make the internet, and social media in particular, a better place, a place better suited to discussions and debates.
Alessandro Oppo (01:24) And I can imagine there is a story behind this software. Did you have the idea, or did one of your co-founders? When did it happen?
Alex Blaga (01:35) Yes.
So one of our co-founders, who is currently our CEO, was working in the European Parliament. He was working for the vice president of the European Parliament, doing specifically the community management on social media. And back in 2022, which we all know coincides with Russia's invasion of Ukraine, together with the waves of migrants
that came to Western Europe, a wave of harmful comments and certain toxic narratives arrived at the same time in the comment sections of our social media platforms. Together with those waves and narratives, the flow and the sheer volume of those comments became unbearable for a lot of political actors,
specifically for the vice president of the European Parliament back then, and for many other actors, but also for businesses. So they were seeing comments like: we don't want these Ukrainians, they should go back, they should stop stealing our jobs, why are they not staying in their country to fight? And what was really interesting was that these messages were composed and sent in a really violent manner,
in what is, in our opinion and in the opinion of our clients, an unacceptable manner, a toxic, unacceptable manner. So at the time, my now colleagues looked around and they couldn't find a solution that could handle that volume and that toxicity effectively. So they got together.
The team slowly expanded and we've built a company around this idea of making social media a better place. It all started in 2022. We're now in 2025 with over 70 clients in over nine countries in Europe and Latin America, also now supporting four heads of state.
Alessandro Oppo (03:43) Okay, so there are different kinds of clients and customers: they can be political parties, institutions, or also companies.
Alex Blaga (04:17) Sure. So although we started in the political sphere, naturally, we slowly but surely migrated to the commercial side of things, because unfortunately the toxicity that oftentimes
starts in the social and political spheres slowly but surely migrates to businesses, to the commercial side of our society. We see it with our clients, such as news media houses, that suffer from the same toxic narratives, from the same toxic comments. And the results are
different and the effects are different, but they're just as harmful for everybody, regardless of whether the organization we work with is political, commercial, an NGO, or an institution.
Alessandro Oppo (05:19) An interesting thing is the large language model built inside the company. You use the... I mean, comments that were probably made by trolls or AI bots, and you train the AI, right?
Alex Blaga (05:39) Yeah, so
we trained our own large language model specifically in a number of languages, mostly European languages, and I'll tell you in a minute why we focused on European languages. We do still work with linguists in specific languages so that we can understand the context, the local context of each country, each
region, and each language in particular really, really well. And by understanding that local context, we can identify subtleties in the language and so filter sentences and comments much more effectively than the social platforms can do by themselves. When I say social platforms, I mean the big ones, the Metas, the Googles, the TikToks and so on,
which natively, of course, do moderation by themselves. But as we all know by now, they do it really, really poorly. So all the toxic content and comments present on their platforms are there because they either don't want to, or cannot, do moderation properly.
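(To make the moderation workflow described here a bit more concrete, here is a minimal, hypothetical sketch of a comment-moderation pass. It is not Trollwall's actual code or API: the keyword list and the classify_toxicity stand-in below are placeholder assumptions standing in for a trained, context-aware LLM.)

```python
# Minimal sketch of a comment-moderation pass, for illustration only.
# A real system would use a trained multilingual LLM; classify_toxicity()
# below is a trivial keyword stand-in so the example stays self-contained.

TOXIC_MARKERS = {"go back", "stealing our jobs", "get rid of them"}  # assumed examples

def classify_toxicity(comment: str) -> float:
    """Stand-in for an LLM toxicity score in [0, 1]."""
    text = comment.lower()
    return 1.0 if any(marker in text for marker in TOXIC_MARKERS) else 0.0

def moderate(comments: list[str], threshold: float = 0.8) -> tuple[list[str], list[str]]:
    """Split comments into those flagged for hiding and those left visible."""
    flagged, visible = [], []
    for comment in comments:
        (flagged if classify_toxicity(comment) >= threshold else visible).append(comment)
    return flagged, visible

hidden, kept = moderate([
    "They should go back and stop stealing our jobs!",
    "I disagree with this policy, and here is why...",
])
print("hidden:", hidden)
print("kept:", kept)
```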
Alessandro Oppo (07:02) I have some thoughts regarding that, but I would like to ask you something about your professional background, if you'd like.
Alex Blaga (07:12) Sure. So I have a very mixed background, in the sense that I have a love for business and politics on the one hand, and communication on the other. So my academic background is in politics and international affairs. Soon after I finished my studies, I delved into the world of
business and communication through various roles. And I'm happy to say that the project that I'm working on right now, Trollwall, really combines these three aspects of my life, which I'm really passionate about, really well: the communication part, the social media, the business, which is an obvious one, but also the political aspect of it.
Alessandro Oppo (08:11) And if you'd like to share something more about your personal background, I don't know, like where do you come from?
Alex Blaga (08:21) So I'm
originally from Romania, but I've spent probably half of my career by now in the United Kingdom, where I completed my academic studies and also worked for a number of years. And now I travel between my home country of Romania and London, England. So if you're going to ask me where my home is,
unfortunately, I don't have an answer at the moment. I'm a citizen of Europe, although my English friends might disagree that the UK is still Europe.
Alessandro Oppo (08:54) Yeah, it's a complex situation. I was thinking about... Because comments can be very toxic, and at the same time there can also be a good manner of saying that you disagree with something. So how can both...
Alex Blaga (09:11) It's a complex situation. Let's not get into it now. Maybe in a second podcast.
Alessandro Oppo (09:39) ...happen? Because I understand that nowadays, with automation, technology, and AI, a country or a political party could actually attack another political party or another country, making it look as though the population thinks something when maybe that is not true. So how do you protect whoever it may be, a political party, a politician, a country, from these kinds of attacks, while at the same time still allowing people to say, I disagree about this specific thing?
Alex Blaga (10:21) Well, the quick and easy answer, and probably the obvious one, would be moderation. Political entities should take care of their moderation. For the sake of this discussion, let's focus on social media. So you mentioned attacks, and these attacks and the
influencing of minds by certain political actors happen in a few ways, as far as we've been able to identify. So we mentioned already the sheer volume of comments. These attacks always, always come with huge volumes of comments that want to carry a certain narrative.
In other words, the more you repeat an idea, albeit a lie, at some point somebody is likely to believe that idea or that lie. So that's the volume I was talking about. On the other hand, another interesting aspect is the violence that I already mentioned, the toxicity of the comments.
The same narratives also come in a violent manner, so that normal people, normal citizens who would usually engage with political content, get deterred from engaging in the discussion. So they get scared, they get pushed away from getting involved in the discussion and starting an actual debate
over a set of policies, over certain ideas; any kind of discussion in a violent environment is less likely to happen. So on the one hand, you push them away, and on the other hand, you repeat it enough times until somebody believes you. And there are so many examples out there in Europe. If we look at the Brexit vote, it's an easy example.
If we look at the elections in 2024 and 2025 all across Europe, we see this pattern on and on and on, with certain actors pushing certain narratives through the comment sections. But not only there, of course; they rely on things like bot farms and troll farms to push certain narratives enough times. But we focus specifically
on the comment sections.
Alessandro Oppo (13:16) And I was wondering, do these kinds of attacks come from foreign actors, mainly from outside Europe, or also from inside Europe, party against party, I don't know, inside the European Union, inside a specific country?
Alex Blaga (13:34) I would say both.
However, when it comes to the volume and the resources, they usually come from outside of Europe.
Opposing parties from within a specific country also look at what worked for others and slowly adopt the same tactics, but often with smaller resources. So they don't have the resources that state actors often have. So we see the same kinds of attacks, the same kinds of narratives,
opposing narratives between parties from a specific country, but at a lower scale or a smaller volume compared to state actors. But what works for states also works for any kind of actor.
Alessandro Oppo (14:35) Yeah, I'm thinking about the future, because nowadays it's so easy to create bots, and with AI probably everyone will also be able to just say, Alexa, create some bots. And so there is talk about
registration with identity, knowing the identity of a person, I think also to avoid these kinds of trolls. At the same time, that kind of profiling is something that, I don't know, not everyone would like, having a sort of identity connected to
the internet.
Alex Blaga (15:32) Yeah. So,
if so far I have been speaking, let's say, in the name of the project and the company that I represent, I would also like to give you a personal opinion and a personal view on this. I do think that both troll and bot farms are an issue, and an issue that we could address, because by now it's so obvious that it's affecting our
democracy overall. And I am in fact a big advocate of building social media platforms that rely on identification. If banks can do it, if your cell phone provider can do it, and you have to verify yourself in order to open a subscription or a bank account, why wouldn't you
be obliged to do it on social media? So it's fine if you have an opinion, maybe even a controversial opinion, but there's an identity behind it. And it's so much easier to control what's going on on social media. And that's not to say that your opinions and ideas shouldn't be valued or shouldn't be taken into consideration. Quite the opposite.
But you know who's behind them. And if you say something really, really toxic, or if you promote certain violent ideas, you could also be held accountable, which is only normal in my opinion. It would also fix a lot of other smaller problems. And I do agree that might just be the solution.
Alessandro Oppo (17:18) I wonder, because I see both pros and cons, and actually I don't have a clear idea about it, because I...
Alex Blaga (17:36) What would you say the contours?
Alessandro Oppo (17:39) As I said, I see the pros of the concept. The con could be, but I'm not sure, of course, that people then don't feel free to say what they think. But I think it's maybe related to
whether the citizen feels that he or she is safe in that country. I think that then the person could say what they think without applying self-censorship, because often I think we end up doing that.
I don't know if I was clear.
Alex Blaga (18:39) At the same
time, isn't democracy all about protecting personal freedoms? So if the democratic system is healthy enough and strong enough, you could say anything, of course within certain boundaries, and nothing would happen to you. You would feel protected, and you would in fact be protected by the system, by the state, and you would be defended.
On the other hand, when democracies become less democratic, if we can put it like that, that's when personal freedoms and personal rights become an issue. Isn't it actually in less democratic and more dictatorial states that citizens are more at risk if they say something? Without giving names
Alessandro Oppo (19:36) Did you say something about...
Alex Blaga (19:39) ...of states, but we could of course name China, the USSR, and other more tyrannical states.
Because that is the alternative, right?
Alessandro Oppo (19:55) Yeah, exactly. I can suppose that maybe in those places too, I mean, they can receive attacks. And so I wonder, in places that are less democratic, if an identity requirement to access social networks
is applied, then people would not be able to express their ideas without consequences. Like, in places that are not very democratic, if users express their ideas, then they can be punished.
Alex Blaga (20:45) Sure. And this
is where the other really important pillar of a strong democracy comes in, which is the rule of law. When you know that institutions will protect you, you won't be scared to voice your opinions. But in less democratic systems, when you know that rule-of-law pillar, that umbrella over everything, is not there,
who's going to be there to protect you? So they go hand in hand. And I do agree that in states like China or maybe Russia, something like this maybe wouldn't be wise to implement. In Europe, on the other hand, in some countries in Europe, I don't see why not.
Alessandro Oppo (21:39) But I really like to think about these kinds of topics, because I think that...
In some way, as we said, there are some pros, there are some cons, and we are the ones thinking about problems and solutions. We are not the ones who will decide whether there will be identification on such networks, but I think that talking about it is very helpful. Yeah, exactly.
Alex Blaga (21:52) Yes.
Maybe not today, but fortunately
we do have the power to vote. We do have the liberty to debate and to have these kinds of discussions, unlike in many other places in the world. But there you go, that's another good idea for a third podcast. We're going to make a series out of this and have proper discussions.
Alessandro Oppo (22:29) Yeah, that would be absolutely awesome. And I was thinking, do you know if other countries, like we mentioned China, we mentioned Russia, or whatever, are suffering some other kinds of attacks, I don't know,
between countries, non-European countries, or maybe from European countries? Yeah, I don't know. And whether there are also companies similar to Trollwall in other, non-European places.
Alex Blaga (23:14) Are you referring to competitors of ours that offer the same service and come from places like China or Russia?
Alessandro Oppo (23:23) Yeah, and I also wonder if Russia has the same problem with other strong countries or whatever, whether they are also facing these kinds of attacks.
Alex Blaga (23:42) It's hard to say, in my opinion. One, because we don't really follow what happens on Russian social media and the Russian internet. Also, when it comes to China, if you take into account the great wall of the Chinese internet that blocks everything, it doesn't have any of the big social platforms that we do, it doesn't have the Googles, the YouTubes,
while at the same time we accept TikTok.
It's hard to say. But that said, it does look like they have more leverage over what's going on in the West than the leverage that we as Westerners have over what's going on in China, Russia, and other similar countries. But when it comes to attacks, I honestly wouldn't be able to tell you. I'd like to believe that
we reciprocate, that what we get is also what we give back in kind, but I wouldn't know for sure.
Alessandro Oppo (24:59) Yeah, it would be very interesting to know about all these, I don't know how to call them, bot attacks. Because, I mean, people are not aware of them; they just read a comment and think, okay, that person is quite mad about a certain thing. But it could be that it's just AI.
Alex Blaga (25:26) It often is. It often is. With all these farms that propagate certain messages, it's more and more difficult to tell if behind a profile or behind a comment there is a real person or just an algorithm. It often happens that you end up arguing in the comment sections over a stupid idea with an algorithm, with no person behind it.
And while at first it might look funny and you might think, I just spent 15 minutes of my day arguing with nobody, with a computer, I think when it comes to decision makers and institutions and political parties and brands, it should make them think that if
we can already do these things with artificial intelligence, we should really take artificial intelligence much more seriously. And we should be trying harder to fight fire with fire. And as a matter of fact, this is what we do. So if actors fight us using all sorts of AI tools,
we have no choice but to build our own AI tools and fight back. And that's what we do, at least in the case of moderation.
Alessandro Oppo (27:03) And yeah, I was thinking that with a comment it's actually quite hard to tell whether there is a person or an algorithm behind it, and nowadays also with a video. So this is quite problematic. And I'm also thinking that
arguing with a person on the internet, just chatting, I've seen that it doesn't matter which language, most of the time it's toxic. From my experience, it is not very easy to communicate with a person about complex topics just by chatting, because we cannot really empathize with the other person; we don't see the other person.
Alex Blaga (27:53) Of course.
Alessandro Oppo (27:54) So
I'm also thinking about the places where this can happen, which could be in real life. And because of the passions you mentioned, one being technology and the other politics, I wonder if you've thought about where debates, or cross-pollination between different thoughts and ideas, can happen.
Alex Blaga (28:27) Yes, so I was mentioning the narratives that are spread. And of course, they can happen. And you raised two ideas there: the
conversation that you have with, perhaps oftentimes, an AI system. Were you referring strictly to the comment sections when you were describing that?
Alessandro Oppo (28:57) I was thinking of someone posting on a Facebook page and then there is the comment section, but it could also be in other parts of a social network. But yeah, we can think about the specific example of the comment section.
Alex Blaga (29:02) Yes,
of course, minds can be changed. Ideas can really be pushed forward. Narratives can be spread really easily, now even in a conversational manner. As you said, you can chat with a bot, and it can carry the same kind of idea and can sometimes almost convince you of a certain idea.
But as you said, that conversation doesn't feel okay. It doesn't feel natural. It feels kind of toxic. And we do understand that, because often what happens is that a certain AI system, an AI bot, uses the simple algorithms of OpenAI, of ChatGPT; there's a ChatGPT interface to it,
which doesn't really understand local context. It doesn't really understand the local history, the local developments. It just has simple ideas that it knows it needs to push forward. But as I said, the key to that is just the repetition of the same ideas on and on and on. And we really resonate with that local context. This is why, at the beginning of the conversation, I told you that
when it comes to the languages that we've developed, we focused mostly on the European ones. On the one hand, this is where we work, but on the other hand, we understand that the big players, the Metas of the world, the Googles of the world, usually focus on big languages. So English,
German, in some cases maybe even Italian. But when it comes to smaller languages like Ukrainian, Romanian, Bulgarian, Greek, they don't really spend many resources or much time on developing those languages. And as a result, the moderation they do in-house in those specific languages is really, really poor.
So they work with very few linguists, very few local experts, and very few moderators at the end of the day. And this is what we do differently. We process enormous amounts of data with local context, so that we understand what people talk about really, really well.
And that's, I guess, the key to keeping that space clean from bots, from trolls, from toxicity in general on the one hand, but also to promoting healthy discussions, healthy debates.
Alessandro Oppo (32:00) And to maybe give an idea about the amount of data, do you have an example to share?
Alex Blaga (32:32) When it comes to data, what I can tell you is that it fluctuates a lot. It's like a sea where waves come and go. We've seen the biggest peaks in the data we processed, naturally, around electoral events. So in 2024, we probably saw the highest peaks
of data, of comments being dropped on social platforms. One interesting case we worked on was the Romanian elections of 2024, with some of our clients having over 500,000 comments on each social account per month, which is an enormous amount for a small country, a small language.
They usually have about three or four social accounts per political party, which would equate to over 2 million comments per month for just one political actor. Just so you can imagine the kind of scale that was going on.
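(As a rough back-of-the-envelope check on the scale described here, assuming roughly 500,000 comments per account per month across four accounts:)

```python
# Rough scale estimate using the figures mentioned above (assumed values).
comments_per_account_per_month = 500_000
accounts_per_party = 4  # "about three or four social accounts"

monthly_total = comments_per_account_per_month * accounts_per_party  # 2,000,000
per_day = monthly_total / 30                                         # ~66,700 comments/day
per_minute = per_day / (24 * 60)                                     # ~46 comments/minute

print(f"{monthly_total:,} comments/month ≈ {per_day:,.0f}/day ≈ {per_minute:.1f}/minute")
```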
Alessandro Oppo (33:56) I was thinking about the infrastructure and the AI model that has to analyze that quantity of messages. With technology it's quite feasible to do that, but manually, no. And I was thinking, how do you imagine the future? Okay, let's think about...
Alex Blaga (34:11) Yeah, absolutely.
Alessandro Oppo (34:24) I mean, democracy, in 10 years, 20 years. Have you ever made that kind of prediction?
Alex Blaga (34:35) I've tried a few times, and then a couple of years would pass and my predictions would undo themselves, especially with the advent of AI and technology. I mean, just look back at how the world looked five, ten, fifteen years ago. At the beginning of my career, when I got into
politics, international affairs, and communication, it was back in 2014, when Russia had, allegedly, invaded Crimea and took it over with their little green men. Now, just a little over 10 years later, there's a
fully fledged war going on in Ukraine. And back then, we in Europe used to see the United States as the beacon of democracy. We would look up to them and think, this is an example, this is what we should aspire to. Now, just over 10 years later, we look at what's happening in the United States, at what
Trump is saying and how their democratic system is becoming more and more fragile. It's almost like they can say anything and nobody can dispute them, nobody can really debate them. And at the same time, in Europe, we seem to hold on to democracy with our teeth and really want to protect it and go forward with it. I think,
looking forward, inevitably, the biggest players and the most decisive factors for our democracy will be, on the one hand, the aspect of war, and on the other hand, technology:
how we go about and what we do about the war that's going on on our doorstep, and how effective we're going to be at fighting it. And then technology: will we be able to protect our democracy? Will we be able to use technology to develop our democracy, with things, as you said, like perhaps developing social networks that
require you to authenticate yourself, where you can only create one account per social ID?
Or will we completely crash it by making it worse and worse with the likes of TikTok, which is clearly, in my opinion, an enemy of democracy? I keep returning to TikTok here.
So yeah, I would say it depends on these two aspects, and it's ultimately up to us to make the world a better place. I can confidently say that what we're doing in our team is exactly this: we're trying to make the world a better place by making social media a better place, and in turn, as a result,
opening up democracy and building democracy.
Alessandro Oppo (38:16) I'm curious about TikTok. Do you think it is a threat to democracy more because it comes from a foreign country, or because it's very addictive for people? Because I also tried it, and you cannot really stop looking at it.
Alex Blaga (38:34) Yes.
So when it comes to TikTok, I see a lot of decision makers, a lot of people on television and podcasts and so on, really hiding behind their words when describing TikTok. I'm going to be honest and more direct, because that's how I like it. I think it's definitely a really nasty drug
and it should be avoided; that's on the topic of addictiveness. On the other hand, of course, it's a tool that comes from the outside world, if we can call it that. It's made by a state which is not necessarily our friend, maybe not anymore. And it's a dangerous tool.
Let's just look at the way it's used and treated at home, on home ground. Chinese kids have access to a very similar platform, but with educational content. Now, have you ever seen what the feed on the Chinese TikTok for Chinese kids looks like compared to what we get in Europe and the West?
Alessandro Oppo (39:56) I've seen something about it, but I haven't really investigated it a lot.
Alex Blaga (40:01) Yes, so
it has very clear limits. They can only use it a certain number of hours a day; it's not unlimited like here, where we use it 24 hours a day. And the content is strictly controlled, and it focuses on educational content: crafts, new languages, engineering.
And then we look at what we get in Europe and the United States, the nastiest content, the kind that rots our brains.
Alessandro Oppo (40:39) I'm joking, but yeah, I understand that sometimes
the less educational content is what people search for, especially if they are young.
Alex Blaga (40:50) Yes. And
it is not only my opinion that it's harmful for citizens in general. But if we look at, again, I'll go back to the Romanian election case of 2024, we had this one candidate, an independent candidate.
Just in the last few weeks, it has been proven that he received support from the Russian state, mostly through TikTok. So the Chinese control TikTok, and through meddling with the algorithm and pumping in millions of dollars, they skyrocketed his accounts to number nine globally in 2024.
So a nobody of a candidate was pushed up to number nine in the algorithms, in the tags used on TikTok, and ended up winning the first round of the presidential elections. So going back to democracy and the threat that technology and certain states can pose to it: in a matter of months,
they can change, they can almost, well, they didn't really succeed, but they can almost change the whole structure of a democratic state through one single app, through some algorithms, through some AI narratives, and just a few million euros.
So it's clear by now that it's not just a simple tool; it's a weapon. It's a weapon for hybrid warfare. And we shouldn't be hiding behind our politeness, our European politeness. We should call it what it is.
Alessandro Oppo (42:42) Yeah, yes.
Yeah, yeah, absolutely. But I'm also thinking about the other big tech platforms, like X/Twitter in some way. We don't know the algorithm behind it. We don't know, likewise, for Facebook, for Meta. I would say that I would actually like more transparency around the algorithms.
Because I think, yeah, absolutely, information is power, and with information you can change the power that is... yeah, political and institutional power can be changed.
Alex Blaga (43:45) And if it were just harmless web content, if it weren't really as important as we say it is, the United States wouldn't fight so hard to get control over TikTok in the United States, to take control away from ByteDance and control it themselves. And the
heads of state, Trump and Xi Jinping, I think it was last week, or recently in any case, were supposed to meet to discuss specifically TikTok and giving up control to the United States, not over the app itself, but over the algorithm, because that's really what's at stake.
Alessandro Oppo (44:38) It's incredible how important they can be and how they can change people's lives, because an algorithm, at the end of the day, is, I don't know, a page with some lines written on it, but then it keeps people,
how do you say, in front of the screen for days, for many hours. Exactly. Exactly. And I mean, we've seen that technology is now very important in every aspect of our life, also in relation to politics. And I wonder if you
Alex Blaga (45:08) Yeah, it keeps them locked in, just like a drug would do. Of course.
Alessandro Oppo (45:31) have thought about any other, let's say, good uses of AI for politics, for democracy. I don't know, a sort of assistant.
Alex Blaga (45:50) Yes. Fortunately and unfortunately at the same time, our roadmap is so long, so big, with so many ideas, that the biggest challenge is picking the ones that are most relevant and most important of all. Just to name a few: we've recently,
actually a few months ago, released some new features. I would mention the drafting of answers for our clients. In short, our clients can build their own databases. They can create their own assistants where they can upload certain files, and based on those files and that database, we can draft answers for them
for the comments they receive on social media. Now, the key word here is drafting. We never reply on their behalf on social media; we wouldn't allow that to happen. But although it might seem like a basic, easy idea, drafting answers really allows, in our case, political actors to engage more with their audience.
This is an issue a lot of them face right now. They simply don't have the resources to pay people to sit in front of computers, read through the comments, reply to them, and engage with the audience, which I think is so important and so critical, again, for democracy. When a political actor, when an MP
posts about something, a new piece of law that he or she supports, and you want to ask them a question, oftentimes people, citizens, will do it on social media, in the comment sections. And again, oftentimes they will never receive an answer, because they simply don't have the resources to reply. And this is what we try to do: we draft the answers for them,
which they can tweak, and really engage with the audience. Of course, we also offer other, more community-management-related features, such as sentiment analysis, which again is really important for political actors, so that they can better understand the pulse of their voter base, of their audience, and what people
feel, because it's really, really difficult to read through and gauge the sentiment of hundreds of thousands of comments and the opinions of so many people.
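(The drafting workflow described here, where clients upload their own material, the system proposes a reply grounded in it, and a human always reviews before posting, could be sketched roughly as follows. This is an illustrative simplification, not Trollwall's actual pipeline; the keyword-overlap retrieval and the reply template are placeholder assumptions.)

```python
# Minimal sketch of a "draft, never auto-post" reply workflow (illustrative only).
from dataclasses import dataclass

@dataclass
class DraftReply:
    comment: str
    draft: str
    approved: bool = False  # a human must approve or edit before anything is published

def retrieve(question: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval over the client's uploaded documents."""
    q_words = set(question.lower().split())
    scored = sorted(knowledge_base, key=lambda doc: -len(q_words & set(doc.lower().split())))
    return scored[:top_k]

def draft_answer(comment: str, knowledge_base: list[str]) -> DraftReply:
    """Draft a reply grounded in the client's own material; posting stays manual."""
    context = retrieve(comment, knowledge_base)
    draft = (
        "Thank you for your question. "
        + " ".join(context)
        + " (draft - review and edit before publishing)"
    )
    return DraftReply(comment=comment, draft=draft)

# Hypothetical example knowledge base uploaded by a client.
kb = [
    "The proposed law extends parental leave from 6 to 9 months.",
    "Funding comes from the existing social insurance budget, not new taxes.",
]
print(draft_answer("How will this parental leave law be paid for?", kb).draft)
```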
Alessandro Oppo (49:01) I was thinking about this approach of using social networks for, in this case, also educating people, not really educating, informing people about a certain thing. So if a citizen asks a question, instead of having the president reply, a chatbot replies, but using all the knowledge.
Because several times in the civic tech field there is this thing about building a platform from scratch, and most of the time civic tech software ends up not being used, while on social networks there is already a user base. While...
Alex Blaga (49:32) Yes.
Alessandro Oppo (49:58) People are using Facebook, Twitter and so on. so tools that run on those platforms, they have probably a higher probability to be actually used.
Alex Blaga (50:12) Yes.
At the same time, I would say I fully agree. A lot of political tech ends up being unused. But I do appreciate and I do encourage politicians specifically to try new tools, to be open-minded, and to adopt new tools, going back to the idea that I mentioned earlier, fighting fire with fire.
Alessandro Oppo (50:15) I see.
Alex Blaga (50:46) I don't see successful politicians in the near future, even winning election cycles without using the latest technologies, without using ~ artificial intelligence at a large scale, whether it's replying to the audience or segmenting the population and the voter base, they have to adopt and they have to adapt really, really fast.
many of those tools that they try, it's bound that some of them will be not very useful and they won't ~ end up adopting them. But I do think it's important that they stay open-minded.
Alessandro Oppo (51:34) And do you think that nowadays politicians are aware of these kinds of tools?
Because a lot of people who know about politics don't know about technology.
Alex Blaga (51:58) That is very true, and somewhat painful for us, if I can put it like that. I do agree that politics is a weird industry. It often feels like, in Europe,
you have the political families, you have the conservatives, you have the leftists, and even at a European level they don't really talk to each other. So they will vote together in the European Parliament, the leftists from Italy together with the leftists from France and Germany, but when it comes to sharing ideas and sharing tools, that
doesn't really happen, like it happens in the United States, for instance, where you would have the Democrats or the Republicans from one state to another really sharing and exchanging ideas. In Europe, this doesn't really happen, and it's quite unfortunate. So in our case, we go country by country,
political party by political party, trying to open their eyes one by one. But what I would also say is that I'm happy to see that events such as the Political Tech Summit in Berlin, a much needed event, bring together
professionals from the political sphere and let them exchange ideas. And I know the organizer of the event was present on your podcast just a few episodes ago.
Alessandro Oppo (54:01) Yeah, it was very interesting talking to Joseph Lensch as well. And I have just a couple more questions. So, what is democracy for you? From a political science point of view, or just your opinion.
Alex Blaga (54:06) Yes.
Very good question. I should have known you were going to ask me this; it's in the title of your podcast.
Alessandro Oppo (54:49) I
don't always ask it, but I think it's interesting, because a lot of times we think about a concept, I don't know, capitalism, and maybe it's different for me and for you, or democracy; it's part of human nature, and I always like to see the other point of view, the point of view of other people.
Alex Blaga (54:54) You don't, right? You don't know.
Yes.
I think, without realizing it, I described it earlier on. I mentioned liberties. I think, first and foremost, democracy is based on personal and civil freedoms and liberties, but at the same time on the rule of law, which is there just to make sure we stay within certain boundaries.
And if something happens, if our civil liberties are threatened, somebody or something will protect those liberties. And I know this is a really simplistic way to describe it, but this is the way I see it. And yeah, I believe that's the core of it: liberties, protection, and strong institutions.
Alessandro Oppo (56:14) Thank you. And do you have any message for the people who are working in the space of political tech? People who are, I don't know, finding new ways of governance, maybe using technology or tools similar to Trollwall?
Alex Blaga (56:42) Yeah, again, I think I mentioned it just a few minutes ago. I would urge them to stay open-minded and to at least try new methods and new tools if they are to stay in the field. That's one idea. And the second one is to communicate among themselves, to communicate with each other, even if they're from opposing
sides of the political spectrum. There's so much they can learn from each other, so much they can adopt from each other, and at the end of the day, all of us would gain from it.
Alessandro Oppo (57:30) So thank you, Alex, thank you a lot.
Alex Blaga (57:33) I actually have a question for you, if I may, if you're not running out of time. You asked me a really interesting question: what is democracy for me? What does it mean for you?
Alessandro Oppo (57:36) Yeah, absolutely.
Good question, actually. I was not prepared. No, I...
Alex Blaga (57:52) Me neither. There you go.
Alessandro Oppo (58:00) As I said, I think there is, in some way, a lot of confusion related to the term, because...
I mean, demos and kratos, so it's like power to the people. But I think that the democracy we see nowadays is quite different from democracy as it was conceived, because nowadays we have the representative system, which is...
It was defined as... There is the famous phrase by, it was Churchill, if I'm not wrong. I don't remember the exact words, but it's something like: democracy is a very bad system, but all the other systems are even worse. And so...
Alex Blaga (58:58) Yes.
Alessandro Oppo (59:03) But I think, as you said, that even if someone has always voted, I don't know, for leftist parties, because of their family and so on, and another person votes, I don't know, for another party, I think that nowadays, with technology, some paradigms can also change.
This is why it is so important to talk about what to do with democracy. Also because there is AI, which is changing everything, and there are new ways of making things worse, maybe with bot farms and so on. And so I think that we should really
sit down at the table. It doesn't matter if I'm left-wing or right-wing; we should all sit at the table and decide what to do. As we said, maybe before the interview, democracy in the future could be
like, I don't know, the best place, where everyone is happy and so on, or it could be the worst place ever. For now we are humans, and humans in the past were sometimes not really, how do you say, good. Now we have technology, and also using technology we...
We didn't really make good use of technology, I have to say.
Alex Blaga (1:00:59) Historically speaking, yes, you're probably right.
Yeah, but are you more pessimistic or more optimistic? Do you think it's going to turn out really, really well or really, really badly?
Alessandro Oppo (1:01:04) Yes.
I am... I don't know, but I would say optimistic in the long term and pessimistic in the short term.
Alex Blaga (1:01:23) Good, me too.
Yes, let's leave it like that. Let's leave it on the optimistic side then.
Alessandro Oppo (1:01:30) Okay. Okay. So thank you a lot. It was a...
Alex Blaga (1:01:34) Thank you. Thank you, Alessandro.
It's been a real pleasure.
Alessandro Oppo (1:01:37) Thank you.