In Conversation with Enrique Dans and Diego Hidalgo
In the second part of our “In Conversation” series, Enrique Dans sits down with Diego Hidalgo to discuss innovation, regulation, and the role of education in mediating our relationship with technology.
© IE Insights.
Transcription
DH: My name is Diego Hidalgo, the author of Anestesiados: La humanidad bajo el imperio de la tecnología. I am a technology entrepreneur by profession, and I have set up several companies, including Amovens, the first car-sharing platform in Spain.
ED: My name is Enrique Dans, and I’ve been a Professor of Innovation and Technology at IE University for years. I conduct and publish research, and I’ve written several books, mainly on technological disruption and its impact.
Hi Diego, how are you?
DH: Nice to meet you, Enrique.
ED: I think this is a very good time to have this conversation. Things have been moving forward for about two or three years now, and they are advancing in a way that makes this conversation probably very different from the one we would have had only a short while ago.
DH: You’re right. I think that technology, and above all the relationship people have with technology and the way they look at it, has changed a lot during the pandemic. Our lives have become very digital, and at the same time this situation calls for a deeper reflection on the subject.
ED: Yes, and it is strange. On one hand, as you say, the pandemic has pressed the fast-forward button. Two years ago, if somebody had a videoconference call, they would panic, call someone from IT and have them sit next to them just in case something went wrong. Now they just get up in the morning, press a button and go straight into a Zoom. On the other hand, there is something else: companies had been able to do practically whatever they wanted for about twenty years, and they are now starting to find themselves in a different scenario.
I think the European Union has always been the great regulator, with regulation almost running ahead of the technology, always trying to regulate the “just in case”, even before anything has happened, while the United States and China went in the other direction. Strangely enough, though, for the last two years both the United States and China have been moving into a regulatory environment that almost follows in Europe’s footsteps.
DH: Personally, I would like to see China follow the European model. Especially in the field of technology, I am a great believer in the potential, or at least the initiative, of the European Union.
ED: You are a tech entrepreneur, and I’m sure you have come across problems getting a company up and running in Europe. You know perfectly well it would have been much easier if you had lived in the United States and been based in Silicon Valley.
DH: Yes. I think the problem is that we’re really at a point where the potential of technology is enormous. I speak here purely from a business point of view, as an entrepreneur; that is why, in my case, I wanted to be an entrepreneur before I started writing, rather than write from a purely theoretical standpoint, because I wanted to stay in touch with the business world and other areas. And there is a whole world of regulation, which doesn’t only affect technology companies but any type of company, that makes setting up a business in a place like Spain not at all easy, and that is really frustrating as an entrepreneur.
ED: I think that the problem… There is a trade-off. If you start regulating too fast, what has happened in Europe happens: there are practically no technological giants in Europe, and I think that is largely due to excessive regulation at the outset.
DH: Putting innovation before everything else, I mean, making innovation the number one priority, is something that worries me, because I think innovation has to be something that is genuinely good for human development and for humanity. But the fact is that there is a certain kind of discourse that says we can’t do this or that, we cannot regulate in one way or another, because it will slow down innovation.
ED: If regulation has to exist in a world as globalized as the one we live in, a world that technology itself has globalized, then regulation becomes a factor, if you like, in competitiveness, and the country that does not regulate gets higher growth in production, for example.
DH: Regulation can often put you at a competitive disadvantage, and this is a major issue. But I think the real question is what the hierarchy of values is. We are dealing with technologies whose potential for control by states or corporations over the individual is so overwhelming that, even if it means becoming less competitive, I think we should sometimes try to limit and regulate them, not prohibit them, but at least set very clear limits.
ED: In Western societies, democracy, human rights and so on are core values, and innovation logically finds it very difficult to go against them. A company that offers a product that violates human rights or democracy tends to end up having problems. Machine learning is a tool that can learn from a mass of inputs, from a whole lot of data, and we can sense that it could be dangerous. In the 1960s and 70s, when that research began, if someone had said, or had convinced an important part of society, that researching in that area was potentially very dangerous, we might never have researched it.
DH: There is a huge question about whether artificial superintelligence, the kind of cross-cutting intelligence that is more like the human mind, will ever exist. But there are people who are total technophiles, such as Bill Gates, who say they don’t understand why more people are not concerned, extremely worried even, about the emergence of artificial superintelligence, because it would effectively be another intelligent species that could clearly compete with the human race.
And so, when we think about evolution, without ignoring the incredible, major anthropological changes that have taken place throughout history, I think those changes call for some humility, in the sense that they unfolded over many generations. Now we are talking about extreme disruption at breakneck speed. This project to transform human beings, which I am personally critical of, does not have many precedents, because to start with it is highly concentrated in the hands of a few organisations that are steering it, planning it and applying it at a much greater scale.
And these are changes that are taking place very quickly and that also touch what makes us human at the most profound, even biological, level. When we talk about transhumanist philosophy, for example, the idea of playing with our bodies, I think that could become a real concern, because we don’t really know what we are doing, it is not backed by any democratic consensus, and these companies are moving their agenda forward in a somewhat underhanded way.
ED: They are exploring. I think it is simply exploration. I mean, between the establishment of the first Neolithic societies and the definitive adoption of cities as a social model, thousands of years went by, and there are still a few nomadic societies left. In other words, it’s a process that has lasted a very long time, and there are still societies, though far fewer now, that are convinced they need to hunt, forage and keep moving, and that settling down would mean losing their very essence. Or take genetic improvement: we have been making genetic improvements for centuries, so that an ear of corn today looks nothing like the original. We were shorter, uglier, less healthy, and we didn’t live very long at all.
So, I think that if we apply this, what we find is that the effects of many of the technologies that we are trying out now will be felt in several generations’ time. Now, for example, we know that a technology we started using several decades ago, fossil fuels, has caused a huge problem and we have to make changes. And look how difficult that is proving to be.
DH: If we look at climate change, for generations like ours, which have not lived through wars or extremely disruptive events in recent decades, I think it is difficult, or at least it was until the pandemic, to imagine disruptive factors, or discourse about rupture in general.
ED: Raising the question of not using fossil fuels before we had developed the highly efficient, cheap solar panels we have now would have been impossible. Now we can imagine a country saying “we are not going to use fossil fuels”, because the cheapest energy we can produce today is renewable energy.
DH: True, but I think here we can see two sides. On one hand, part of the solution comes from innovation, and we must keep the best of it; on the other, part of the solution, in my opinion, should come from moderation, from being more frugal users in areas that consume a lot of energy. We should produce cleaner energy on one hand and consume less of it on the other. The other interesting thing about going back through history is that certain historians, and anthropologists too, have argued that we actually lived a bit better before that revolution than just after it. And this image we have, the idea that farming enabled us to be happier, or at least to lead more stable lives, is misleading: farming actually made human beings more vulnerable than before.
ED: The life expectancy of a hunter-gatherer scarcely reached thirty years. So, I think the interesting thing is to understand that the transformation of human society has been the result of many different technologies having an impact over time. And that obviously means we can’t say, from an anthropological viewpoint, that we would be happier if we went back in time.
Although possibly that is true, because our lives now are much more complicated, stressful, and so on. But now the stress is “I can’t make it to the end of the month”, whereas before the stress was “the animal I need to hunt might kill me tomorrow”.
DH: We have more stress thinking about whether the end of the world is nigh, as well as wondering whether we are going to make ends meet.
ED: In fact, this is a huge worry for the younger generations.
DH: Exactly. But the interesting thing is to realize that innovation does not necessarily mean human progress, and that is the point I’m sticking with.
ED: I think that when an innovation wins through, it does so for specific reasons, and the usual reason is that a large part of society sees it as positive. So, just as I found it difficult to explain to my wife’s grandfather that I was working when I stayed at home in front of my computer, trying to explain to someone from the Middle Ages that you and I are working while sitting here having a chat is simply not feasible. Work has changed completely.
DH: What worries me, among other things, are these highly centralized projects that affect the very core of who and what we are, undemocratic projects run by very few people. Because the bottom line is that the big companies today, and there are more than a few of them, have control over things that are totally essential parts of who we are, yet each of them is run by one or two people.
ED: Demonizing high-tech entrepreneurs is the easy option, because we can actually see that there is a Satya Nadella or a Tim Cook taking decisions, but in practice it’s a rather idealized vision. These companies do what they can given the regulation around them, the competition they face, and what the market allows them to do. So, of course, there have to be pioneers who have an idea, develop it, put it on the market, and do things that make other companies follow them.
DH: What happens is that companies like Google really take advantage. They’ve created a marketplace which, in my opinion, sells off small lots of our freedom, and which I think should simply be banned.
ED: Hang on, let’s put this into context. When Google appeared, when the concept of a new way of searching through large sets of information appeared, Google was just a search engine.
DH: Yes, yes.
ED: As a search engine, it brought lots of very good things to society. Nowadays, the work of many, many people would be impossible and unthinkable without Google. So when do things get complicated? When do you start to have objections?
DH: In 2003.
ED: When the model of “let’s finance this with hyper-targeted advertising” appears. Exactly. Let’s segment advertising in a particular way, and all that. In reality, your problem appears in 2004, when Gmail arrives on the scene. It is no longer a question of “I know who you are according to a cookie”; it is “I know who you are because you have said it is you and you have logged in”. When we start to combine these kinds of things, that’s when it starts.
That’s when the problem starts, and it gets much worse when somebody comes onto the scene, in this case an entrepreneur on a university campus, who creates a “hot or not”, a totally absurd social network that says this person is hot and this one isn’t, and whose real proposition is: everything you do on this social network, I am going to collect, and I am going to allow you to be targeted according to what you think about things. Your problem is not with the original company. Your problem is with how it evolves.
DH: I have nothing against search engines; I think they are great and I always use them. I don’t use Google, though; I use DuckDuckGo and Qwant, alternatives that, to my mind, go back to before Google’s original sin. What I don’t want is to live in an ecosystem that, as you have just described in the case of Facebook, knows everything about you and takes advantage of that data to control you and your decisions, mostly without you realizing it.
There is something else I wanted to talk to you about: the so-called digital divide. We often talk about the digital divide as the gap separating people who have access to modern technology from people who don’t, who miss out on many opportunities, are excluded from many circles and end up being left behind. But the real digital divide, for me, is going to be, or already exists, between people who are aware of how algorithms can control us and people who are not: those without that knowledge decide, because it is more convenient, to hand over every aspect and facet of their lives to these algorithms, and then lose a huge amount of control over their lives without realizing it.
ED: What is worrying? I think technology always evolves in the same way, becoming cheaper and easier. You can give a smartphone to almost anybody; making that person understand that their smartphone is also used to see exactly where they are at all times, and for many other things too, is far harder. That full picture of what a person and their smartphone imply is much more complex. I think what we have to do is make sure these exclusions are sorted out as soon as possible, and the key lies entirely in education.
DH: A large part of education should be devoted to making people understand the ins and outs and the consequences of using technology, at least at user level.
ED: And not just at user level. As soon as you are surrounded by programmable objects, I think knowing how to program becomes really important.
DH: At user level, you need to understand that everything you do when you use technology, or sometimes simply when you are surrounded by it without being aware that you are using it, can have consequences. What we need to aspire to is to take as much advantage as possible of the benefits of technology. And right now, we are paying a hidden, very indirect price for many of the technological services we use, among other things because many of them are free, and at end-user level, as consumers at least, we don’t really know how we are paying for them.
ED: Because you are not the user. You are the raw material. You are the product.
DH: Exactly, exactly. The adage that if you don’t pay for the product, you are the product makes people smile, but it is actually something really serious that people need to understand much better. I think we often think, or public authorities fall into the trap of thinking, that teaching people about technology means providing a tablet to every pupil over five years of age, or filling classrooms with screens. That conveys the wrong idea that simply being hyper-technological is going to empower people.
ED: But teaching people about technology does not mean filling everything with technology. It means that somebody who is going to be surrounded by technological objects their entire life has to lose their fear of them, start to use them and become familiar with them as soon as they are old enough not to put them in their mouths. That’s how you introduce technology without having to teach it as a subject. Adding technology as a subject means making it vertical, and that is not necessary. What is important is for technology to be a natural part of a history class, of a maths class, of every class.
Why do we have a problem with fake news, for example? Because we teach children when they are little that truth comes in a book. So what happens afterwards? They rely on very few sources of information, because that is all we have taught them to use. It might be a better idea to teach them, when they are little, that to find out what happened in the First World War they should go to a search engine, look for lots of information about it, and learn to filter it.
DH: I totally agree with the last thing you said. However, I disagree in one sense: I think one of the reasons we have fake news is that it circulates about six times faster than real news on social networks, which means that the news that stirs the most emotion in us, and which on many occasions turns out not to be true, gets more exposure.
In terms of early education, I don’t think we are even equipped, from a neurological point of view, to deal with certain processes that take place in a digital environment. What these tools do, what they try to do by whatever means, is get children used to very fast brain reward cycles. Yet major innovations are born out of boredom, or at least out of the capacity to be amazed, and a large part of digital technology, in its current shape and form, accustoms us to immediacy, which means we don’t know how to cope, or end up very frustrated, when rewards are not instant.
And I see this as a problem mainly because I think it kills curiosity and the wow factor. It is therefore something we need to try to control at an individual level as well, and in our children’s education.
ED: I think the point is that technology gives you access to a whole lot of information, and that information is going to flow quickly, much quicker, much faster. So what you have to do is get people used to the fact that this flow of information is going to be in front of their eyes constantly, and that they have to learn to manage it.
I mean, what happens when you restrict things, when you say “I am not going to give my kid a mobile while they are little”, is that they start to use one somewhere else, at a friend’s house, and they end up using it badly. It is the same as with sex education: what we have done is bring forward the age at which it is explained to kids, and with technology I think we need to do exactly the same, bring it forward so that they see it as normal. OK, yes, there are issues, and it does change the way our brain works. We lose skills, and two generations from now what will happen is what has already happened to me: I am almost incapable of writing by hand.
DH: I disagree a bit there, because I think this sudden, rapid outsourcing of many cognitive skills, with obvious examples such as memory and our sense of direction, has happened so quickly that we have lost them, or are delegating them, at a pace that makes human beings extremely vulnerable.
ED: I don’t think you are giving the plasticity of the human brain the credit it deserves. The human brain has always been able to adapt to the environment around it. If you know you can’t write my phone number down, you will make sure you remember it. I mean, I think we are much better at adapting than that. The history of humankind is full of supposed problems.
Oh no! Someone has invented books, and now, when somebody reads, they get so absorbed that they are not aware of what is going on around them. Then the TV gets invented and it’s “how are we educating our kids with TV, this is a disaster for the next generation”. But nothing ever happens. We adapt.
DH: What worries me, seeing its power today, is the lack of control we have over technology and the fact that many people do not realize that more and more facets of their lives are being delegated to it. I think we need to find ways to curtail it, to limit it, to rebuild barriers that empower everybody to keep taking advantage of all the incredible things it can give us without paying the high price we are currently paying in terms of freedom. And perhaps, in some cases, that may mean giving up certain benefits that technology can bring.
ED: Regulation? Yes, though perhaps not immediate regulation; probably later, and more about sorting out problems. But yes, I do think it has to be done.