Google serves up answers to almost 6 billion searches every day, selecting results based on over 200 factors. These curated responses, which multiple reports suggest favor Google’s own properties (maps, apps, docs, stores and advertisers), influence our speech patterns, global commerce, education, public discourse and political dialogue, not to mention your personal ambitions and our global consciousness. Are you ok with that? I’m not.
I have some ideas.
From revolution to domination
The birth of the World Wide Web was a blessing to the world. It was, in every sense, a new “planetary technology” with immense transformative potential. It provided a single, globally accessible virtual space to store and access knowledge, and to invite and exchange news, ideas and opinions. It was a quantum leap beyond what we had in the pre-WWW era, when access to information and knowledge of this kind could only be found in communal environments like schools, universities, libraries, town halls and newsstands, and was shared primarily through physical, face-to-face interactions with people.
Today, people across the world can create and distribute their own Web content. The physical constraints on transferring knowledge and information have become irrelevant. And that collectivization of digital information is revolutionizing multiple industry sectors, advancing research across the humanities and the sciences, and enriching our lives immeasurably.
To deal with the data explosion, search engines have come to our rescue. By crawling sites, categorising and ranking data via semantic analysis, and identifying common themes, they add immense value to our everyday experience. But their output is hardly ‘neutral’, and much of the content they crawl is unregulated and unverified.
So, how can we act on this?
If we understand the impact of limiting access to data and insight, and of targeted misinformation, we should be doing something about it.
If we were to establish an online regulator or gatekeeper, would the regulator strike a blue pencil through specific items of content? Yes? No? Perhaps? Would we simply be transferring authority from Google to the regulator? And how could we hope to keep such a regulator ‘sane’ and in touch with changing privacy and technology dynamics?
Governments are weighing in.
- Following the Cambridge Analytica saga, the UK proposed an “Internet Safety Strategy” green paper, which looks at how government can ensure Britain is “the safest place in the world to be online”.
- The EU proposed regulating and preventing the dissemination of terrorist content online in 2018. And, more importantly, has put forward legislation that aims to uphold international market freedoms, “…addressing unjustified geo-blocking and other forms of discrimination based on customers’ nationality, place of residence or place of establishment within the internal market.”
If governments are able to regulate TV broadcasting then surely they can enforce regulation of the Internet. Or can they? How will this affect Google and other search engines? And is this desirable given the example that Dragonfly sets?
Meantime, there’s quite a bit you can do in your personal capacity. Your awareness of the power of search engines — and your active vigilance — can minimize their impact on you and your well-being, and on those around you. Importantly, it can help tailor the emergence of search engines of the future that are more informed, relevant, transparent, agile, functional, better designed and collaborative.
It’s important to have a sense of humour.
But the joke will be on us if we don’t become more vigilant and take better care of this incredible knowledge asset the world is creating. Is Google evil? Its famous ‘don’t be evil’ phrase can still be found in the last sentence of its corporate code of conduct. I’d like to believe it.
Meanwhile, we’ve got moves, right?
Step 1: Don’t rely on a single stream of information. Go incognito. Get weaving. Try Beta.cliqz or DuckDuckGo. Try to rely less on single hubs of information such as social media applications.
Step 2: Question everything. As humans, we are naturally curious—follow your gut.
Step 3: Question the UI of search engines and demand more transparency. How would someone unfamiliar with how SEs work interpret and evaluate their output? Would they look for a date, a reference to the author, the number of edits applied? It’s an interesting thought experiment.
Step 4: Check your biases. Try to identify whether the information provided by SEs contains triggers or content that amplifies or reduces your own biases.
Step 5: Is it relevant? Irrelevant information can make valuable information harder to find. Be aware of search fatigue and how this can impact your search for relevance. You have a voice – down-vote irrelevant data and surface the good stuff.
Step 6: Protect your psyche. Our emotional well-being can be unintentionally affected by the delivery of “good” or “bad” news. The digital speed at which we all now receive news can significantly alter the way we think and act in our everyday lives. SEs play a huge part in this, and the COVID-19 crisis has amplified it. Switch up your provider. Put yourself first.
Step 7: Guard your dignity. We may suffer psychological distress or reputational damage when potentially embarrassing facts about ourselves are collected and openly shared (it’s usually embarrassing photos from 10 years ago…). Be more vigilant, and think carefully about how you navigate the Internet.
Step 8: Be open, learn, act. People can hold conflicting values simultaneously. In an evolving digital world, “new” and “disruption” are common fare and the voices of millions of people, cultures and communities are being heard for the first time. Be open to this knowledge. Learn. And when you can contribute or mitigate, raise your own voice. See our DA blog here.
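Step 3’s thought experiment can be made concrete. Below is a purely hypothetical sketch, in Python, of the provenance metadata a more transparent search result might expose alongside each link; every field name here is my own invention, not any real engine’s API.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TransparentResult:
    """A hypothetical search result carrying provenance metadata."""
    url: str
    title: str
    author: Optional[str] = None        # who created the content
    published: Optional[str] = None     # ISO date of first publication
    last_edited: Optional[str] = None   # ISO date of the most recent edit
    edit_count: int = 0                 # how often the page has changed
    sponsored: bool = False             # was this placement paid for?
    rank_factors: list = field(default_factory=list)  # why it ranked here

# Example: what a single, fully disclosed listing could look like.
result = TransparentResult(
    url="https://example.org/article",
    title="An example article",
    author="Jane Doe",
    published="2019-04-02",
    edit_count=3,
    rank_factors=["keyword match", "regional popularity"],
)
```

Seeing fields like `sponsored` and `rank_factors` surfaced next to every link would let even a non-expert weigh a result before clicking, which is exactly the transparency the step above asks for.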
Too much knowledge?
There is a saying that “a little knowledge is a dangerous thing”. Whether more knowledge is safer remains an open question. I believe the democratisation of information and knowledge will unquestionably add value to the human collective. When we see it threatened, we, as designers and thinking individuals, have a duty to protect this liberty.
We can do so by raising our concerns, making society aware of the pitfalls, and insisting that search engines (and other actors) implement more ethically refined systems design and ensure best practices are maintained, factors that can improve how we navigate the present as well as help us create a better and safer future, faster.
This topic is becoming more and more important, and I would love to continue the conversation. Will the principles of data, research, information and the sharing of digital knowledge change the way we design? Let me have your thoughts.
Having recently relocated to Germany from the UK, I was struck by the differences in the information that a Google search in each of the two geographies delivers. Perhaps to be expected, but an eye-opener nonetheless! A reminder that the algorithms and mechanisms leveraged to rank content, along with SE advertising models and the SE’s access to my personal data, all influence what search results I see.
What struck me was the following: to make an informed decision I want the best information but what if I am only getting a list of information that is rated most popular in a specific geography? How do I know if what I am reading online in the UK is also available in Sudan or Korea? How do I, as a user, know that the SE results suggested (through common or collective interest) have been derived from relevant or “sophisticated” sources? Or, disturbingly, what if this is data that Google suggested because I am of a certain race, age, nationality and currently living in Germany?
That’s pretty limiting. For me personally and for any business relying on online research.
These are simple thought experiments, and they raise important questions with delicate and difficult answers. Not to mention, with the recent global outbreak of COVID-19, these questions hold even more relevance. The genie is out of the bottle.
What does this mean?
It means that there’s potentially a lot of dark data out there—or at least, data that’s been made ‘dark’ for me. And it may be really insightful or worthwhile information. For me, all data in the public domain should be made available—by everyone.
Information hazards impact business and design
As part of a design agency, it’s vital for me to have a clear understanding of how people interact or engage with a client’s brand, products and services.
It’s important for us to truly understand what makes users satisfied. Online research and analytics are important indicators. Awareness of potential biases and data ‘skews’, what experts now call “information hazards”, intentional or not, must now become part of gaining such clarity.
Information hazards are often subtle and undetected, and they are occurring with greater frequency than ever before as a direct result of how SEs operate and mesh with other data-intensive services across the Internet. While information hazards are very different to physical threats, they are potentially as dangerous given the importance of the data used by businesses for analytics, marketing and personalisation … and by individuals to make daily decisions, including political ones which can have a broad and long term impact on economies, cultures and individual liberties.
For example, while travelling around Europe over the last year, I carefully followed the political dialogue back home (i.e., Britain and Brexit) online. The information curated for me as I crossed borders made clear that there are subtle differences in the content suggested or highlighted as “relevant” for users in certain contexts. Will the UK’s and the EU’s steps to regulate data privacy be enough to curtail this aspect of online behaviour by SEs and other actors? Recently, the US government has been in active talks with several large technology corporations, such as Google and Facebook, to explore how location data could be used to combat the COVID-19 pandemic. If this goes ahead, it will be another important paradigm shift and milestone.
Be aware, be alert
I believe that all of us—anyone who is part of the global digital workforce or the general public, experts in digital design (software engineers, UI designers, researchers, usability engineers, systems analysts, sociologists) and in many other fields, including law, the new generation (young people who may not know better), and every single business engaging with customers digitally—should be discussing search engines, both their role and their impact.
There is art, science—and commercial interest—all at work here.
So, how can you act to protect your right to know?
- Get to grips with privacy. Understand your options and exercise your rights.
- Understand what search options exist beyond the obvious. Use the new tools that are emerging, e.g., DuckDuckGo, a search engine entirely focused on privacy.
- Demand UIs to provide greater transparency. Would you make better ‘click’ decisions if you knew who created the content, when, how, and where?
- Be aware of how every click you make can scale and skew the results you are served.
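The first two points above can even be practised mechanically. Here is a minimal Python sketch (the helper name is mine; the URL patterns are the well-known public query formats of each engine, but verify them before relying on this) that turns one query into a ready-to-open URL per search provider, so you can compare what each one curates for you:

```python
from urllib.parse import quote_plus

# Publicly known query-URL patterns; {} receives the URL-encoded search terms.
ENGINES = {
    "DuckDuckGo": "https://duckduckgo.com/?q={}",
    "Startpage": "https://www.startpage.com/sp/search?query={}",
    "Bing": "https://www.bing.com/search?q={}",
    "Google": "https://www.google.com/search?q={}",
}

def build_search_urls(query: str) -> dict:
    """Return one ready-to-open URL per engine for the same query."""
    encoded = quote_plus(query)
    return {name: pattern.format(encoded) for name, pattern in ENGINES.items()}

if __name__ == "__main__":
    # Print the same search as seen through four different gatekeepers.
    for name, url in build_search_urls("internet safety strategy").items():
        print(f"{name}: {url}")
```

Opening those URLs side by side (for instance with Python’s built-in `webbrowser.open`) makes the differences in ranking and curation discussed above easy to see for yourself.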
Join me in part 2 as I dive deeper into what drives SE results and expand on the options we have to ensure the ideal of democratizing information and knowledge remains a reality.