Search engines of the future
I love Google. Google is a technical marvel. A great achievement. I could not do my job without it. But in its evolution to today’s 90%+ dominance of search traffic, is Google gaining a monopoly on information?
As Cliqz, the builder of an independent search engine beta.cliqz.com, notes in this article: “If you do not see why someone with a 93% information monopoly constitutes a risk to democracy and freedom, then you should not be reading blog posts on the Web but maybe some history books instead”.
Are search engines bending reality by presenting a popular view? Are they limiting our exploration, our potential, our reach and our rights? These are just some of the questions I raised in my previous blogs.
I believe the Internet is a living, dynamically growing body of information, a gateway to the world’s experiences, thoughts, important discoveries and advances.
At present, the algorithms and mechanisms of the search engine giants, led by our clicks and online behavior, are shaping this invaluable global resource.
“By making Web pages close or distant (e.g., page ranking), the search engines control Internet architecture. They distribute the space according to users’ needs. … [and] Google takes up the strategic position located at the heart of this geometric structure, and controls Internet space,” says Louise Drulhe in her Critical Atlas of the Internet.
Does the content we are pointed to by Google offer a true reflection of our world or a convenient one preferred by a few with their own agendas?
A narrow path shapes our ‘truth’
When we browse online, we go on a clickable journey, from hyperlink to hyperlink; we satisfy our ‘wanderlust’, we explore… Fifteen years ago, users had a fair amount of independence and a huge favourites/bookmark list. Now, nearly 80% of users end up heading in the same direction, to the same webpages and the same services, keeping the “wheels” spinning: the ecosystem of A/B content displays and tailored advertising that drives sales for advertisers and revenue for Google.
And thanks to large global players such as Google, we are now more traceable than ever before. Our virtual movement maps and shapes the WWW’s surface. Our Internet is tailor-made… and when everyone is being served up the same results and we accept what we are given without question, it becomes a shared “truth”.
If you are only using Google for your daily search needs, let me ask you: who did you vote for in the last election and how did you choose your insurer? I’m betting the Internet—and Google—had something to do with it.
Tech and bad actors blur our vision
Right now, the rapid evolution of digital platforms, services and technologies is blurring our vision. The underlying technology is expanding and advancing at an incredible rate; even experts in the field are surprised by it. It’s easy to be “hypnotised”. Meanwhile, key players, institutions and stakeholders (whether they have good intentions or not) are all leveraging advances in data manipulation, machine learning and AI as quickly as they emerge to advance their own agendas.
To us, the ordinary users, their strategies are often more subtle than we can fathom.
- Cambridge Analytica used Facebook data to create psychological profiles of users, then targeted them with propaganda techniques—including content and ads on Facebook that contained loaded political and emotional messages that played to their deepest fears and beliefs—to influence the 2016 US elections.
- Does rigging data influence public opinion? Dr Robert Epstein and Ronald E Robertson tested several thousand participants in a series of experiments to see if online search results influence opinion. They do.
- And then there’s Dragonfly, the blatant nod by Google in the direction of China’s human rights abuses. The Dragonfly project is Google’s censored search engine built for China. Disturbing information revealed by ex-Google employee Jack Poulson includes Google-constructed blacklists for search terms containing phrases such as “human rights” and “Nobel prize”, code written to show only Chinese air quality data from an unnamed source in Beijing, and the linking of telephone numbers to searches. In July 2019, Google said it had terminated the project. Has it?
- Google’s technical ‘innovations’ are also being noticed: an ‘investor-driven design’ that makes search results look like ads, and technical modifications to Chrome that are causing a stir in the software community. These have raised burning questions about governance and due process which Google is being forced to answer.
- Facial recognition technology is also becoming a threat. Facebook has been ordered to pay a $550 million fine to settle a US lawsuit on the matter. As one commentator noted: “This biometric data is so sensitive that if it is compromised, there is simply no recourse… It’s not like a social security card or credit card number where you can change the number. You can’t change your face.” Use of this technology is something that has been debated in the EU for some time.
So, how can we act on this?
If we understand the impact of limiting access to data and insight, and of targeted misinformation, we should be doing something about it.
If we were to establish an online regulator or gatekeeper, would the regulator strike a blue pencil through specific items of content? Yes? No? Perhaps? Would we simply be transferring authority from Google to the regulator? And how could we hope to keep such a regulator ‘sane’ and in touch with changing privacy and technology dynamics?
Governments are weighing in.
- Following the Cambridge Analytica saga, the UK proposed an “Internet Safety Strategy” green paper. It looks at how government can ensure Britain is “the safest place in the world to be online”.
- The EU proposed regulating and preventing the dissemination of terrorist content online in 2018. And, more importantly, has put forward legislation that aims to uphold international market freedoms, “…addressing unjustified geo-blocking and other forms of discrimination based on customers’ nationality, place of residence or place of establishment within the internal market.”
If governments are able to regulate TV broadcasting then surely they can enforce regulation of the Internet. Or can they? How will this affect Google and other search engines? And is this desirable given the example that Dragonfly sets?
Meantime, there’s quite a bit you can do in your personal capacity. Your awareness of the power of search engines, and your active vigilance, can minimize their impact on you and your well-being, and on those around you. Importantly, it can help shape the emergence of search engines of the future that are more informed, relevant, transparent, agile, functional, better designed and collaborative.
It’s important to have a sense of humour.
But the joke will be on us if we don’t become more vigilant and take better care of this incredible knowledge asset that the world is creating. Is Google evil? Its famous ‘don’t be evil’ phrase can still be found in the last sentence of its corporate code of conduct. I’d like to believe it.
Meantime, we’ve got moves, right?
Step 1: Don’t rely on a single stream of information. Go incognito. Get weaving. Try beta.cliqz.com. Or DuckDuckGo.
Step 2: Question everything. As humans, we are naturally curious—follow your gut.
Step 3: Question the UI of search engines and demand more transparency. How would someone unfamiliar with how search engines (SEs) work interpret and accumulate their output? Would they add a date, reference the author, or the number of edits applied? It’s an interesting thought experiment.
Step 4: Check your biases. Try to identify whether the information provided by SEs contains triggers or content that amplifies or reduces your own biases.
Step 5: Is it relevant? Irrelevant information can make valuable information harder to find. Be aware of search fatigue and how this can impact your search for relevance. You have a voice – down-vote irrelevant data and surface the good stuff.
Step 6: Protect your psyche. Our emotional well-being can unintentionally be affected by the delivery of “good” or “bad” news. The digital speed at which we all now receive news can significantly alter the way we think and act. SEs play a huge part in this. Switch up your provider.
Step 7: Guard your dignity. We may suffer psychological distress or reputational damage as potentially embarrassing facts about ourselves are collected and openly shared. Be more vigilant, and think carefully about how you are navigating the Internet as a user.
Step 8: Be open, learn, act. People can hold conflicting values simultaneously. In an evolving digital world “new” and “disruption” are common fare and the voices of many people, cultures and communities are being heard for the first time. Be open to this knowledge. Learn. And when you can contribute, raise your own voice. See our DA blog here.
Too much knowledge?
There is an idiom that says “a little knowledge is a dangerous thing”. It is an open question whether more knowledge is safer. I believe the democratisation of information and knowledge will unquestionably add value to the human collective. When we see it threatened, we, as designers and thinking individuals, have a duty to protect this liberty.
We can do so by raising our concerns, making society aware of the pitfalls and insisting that search engines, and other actors, implement more ethically refined systems design and practices—factors that can improve how we navigate the present as well as help us create a better future, faster.
This is a topic that is becoming more and more important. I would love to continue the conversation. Will the principles of data, research, of information, of sharing digital knowledge, change the way we design? Let me have your thoughts.