
Regulatory Sovereignty in India: Indigenizing Competition-Technology Approaches, ISAIL-TR-001

for centuries against capitalism and its colonial guises. Leanne Betasamosake Simpson reflects on rationality, but from the perspective of human meaning. She affirms a meaning that “is derived not through content or data or even theory but through a compassionate web of interdependent relationships that are different and valuable because of difference.” Why difference exactly? Because only by respecting difference do we stand any chance of not “interfering with other beings’ life pathways” and their possibilities for autonomy.


Building on this, it becomes important to estimate the role of societies: there can be no absolutely technocratic approach. There is also a pressing need to address this kind of issue, which – at least for India as the subject of assessment – must be met by developing solutions that decolonise and provide affirmative answers catering to Indian interests, realities and ecosystems combined. This is taken up in the sections that follow.

The Problem of ‘Omnipotence’ and ‘Omnipresence’ in Sectors and their Intersectional Relationship

What are Omnipotence & Omnipresence?

• Omnipotence is the quality of having unlimited or very great power; it is associated with the idea of an unlimited capacity to execute.
• Omnipresence is the property of being present anywhere and everywhere, without limit.

How that “unlimited” is defined and structured, in both cases, is a matter for review. Ben Thompson proposes that the rise of computers made humans omnipotent (computers can do anything), the rise of the internet made humans omniscient (computers know everything) and the rise of mobile computers – most particularly smartphones – made humans omnipresent (computers are everywhere) (Thompson, 2014). The undesirable aspects of omnipresence can be summed up by a simple hypothetical example that may no longer be hypothetical: imagine a network of spies constantly measuring certain aspects of your existence, such as your health, your emotions and what you are doing. This information is then sold to someone who can extract value from it, such as an advertising agency that uses it to guess which products you are most likely to buy and starts sending you pamphlets and magazines related to those products.
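The data-broker scenario above can be sketched in a few lines of code: a tracker aggregates observed behaviour into an interest profile, which an advertiser then queries to rank products. All names and categories here are illustrative assumptions, not drawn from any real system.

```python
from collections import Counter

# Hypothetical tracker: aggregates observed behaviour into an interest profile.
def build_profile(observations):
    """observations: list of (category, weight) pairs from tracked activity."""
    profile = Counter()
    for category, weight in observations:
        profile[category] += weight
    return profile

# Hypothetical ad broker: ranks products by overlap with the profile.
def rank_products(profile, products):
    """products: dict mapping product name -> relevant interest category."""
    return sorted(products, key=lambda p: profile[products[p]], reverse=True)

observations = [("fitness", 3), ("cooking", 1), ("fitness", 2)]
products = {"running shoes": "fitness", "cookbook": "cooking", "yoga mat": "fitness"}
print(rank_products(build_profile(observations), products))  # fitness products ranked first
```

The point of the sketch is how little is needed: a handful of weighted observations already suffices to order advertisements by a person's inferred interests.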

How algorithms are shaping our lives

From the moment we wake up to the moment we go to sleep, algorithms track and log details about us and use this information to suggest actions and activities to us (Exposure Labs, 2020). In 2018, the technology magazine The Verge reported on an internal video made by Google which envisioned a future where algorithms would have complete access to and control of user data, and how this data could help Google guide individuals to fulfil certain goals or actions in line with both Google’s and the individual’s missions. For example, an environmentally conscious individual might be nudged to buy groceries from a local vendor and opt for an Uber Pool in order to minimize their carbon footprint. While the complete control of a user’s data is portrayed in a positive light in this example, the video also touches on how the algorithm could shape user opinions ahead of major decisions like elections, bringing the world more in line with Google’s vision. Algorithms are already doing something like this today by:

- Telling us which news articles to read and, based on the ones we spend more time on, serving us ads later in the day. These ads may shape our political perspectives and our understanding of basic and complex issues, and may thus even decide the outcome of our country’s next election.
- Deciding, in healthcare settings, which medicine we should be prescribed and what health complications we may be prone to in the future. Recently an algorithm was deployed which could assess a patient’s risk of suicide (Goldhill, 2018), while algorithms are also being used to study and point out cancers and tumours on MRI scans and reports (Oren, et al., 2020). Your life may literally depend on an emotionless machine within the next 50 years.
- Making hiring and firing decisions, so that in a few years your livelihood may also depend on how good your resume looks to a machine.
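A toy illustration of the last point, assuming a hypothetical keyword-based screener (real hiring systems are far more complex): a resume is reduced to a score, and candidates below a cutoff are discarded before any human ever reads their application. The keywords, weights and cutoff are all invented for the example.

```python
# Hypothetical keyword-based resume screener, for illustration only.
KEYWORD_WEIGHTS = {"python": 3, "sql": 2, "management": 2, "excel": 1}
CUTOFF = 4  # assumed threshold below which a candidate is auto-rejected

def score_resume(text):
    """Sum the weights of the keywords found in the resume text."""
    words = text.lower().split()
    return sum(w for kw, w in KEYWORD_WEIGHTS.items() if kw in words)

def screen(resumes):
    """Return only the candidates whose score clears the cutoff."""
    return [name for name, text in resumes.items() if score_resume(text) >= CUTOFF]

resumes = {
    "A": "Experienced in Python and SQL pipelines",
    "B": "Ten years of carpentry and customer service",
}
print(screen(resumes))  # candidate B never reaches a human reviewer
```

Nothing in the sketch asks whether candidate B could do the job; the scoring function simply encodes whatever its designers (or training data) happened to value, which is precisely how embedded bias scales.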



The growing use of algorithms everywhere has increased privacy concerns as well. How your data is captured and stored, and who has access to it, is a cause for worry for many individuals. Imagine if the hospital’s algorithm wrongly marks you as at risk of suicide, and this information is passed on to the algorithm used to select candidates for an interview for your dream job. Or imagine an algorithm used to analyse which employees or candidates have a higher probability of undergoing a major life change within the next year (such as pregnancy or marriage). These algorithms may discard your candidacy because they do not see you as a long-term addition to the workforce.

Ubiquitous Computing

Ubiquitous computing refers to the incorporation of technology “everywhere, everything, all of the time”. The idea is to make computing resources available anytime and anywhere, freeing the user from the constraint of interacting with ICT devices explicitly via keyboards and screens. Ubiquitous computing requires computers small enough to be embedded into everyday objects, augmenting each object’s utility and purpose and making it more useful and capable. This is already becoming a reality: progress in semiconductor design has enabled the miniaturization of computers to the extent that we can now wear them on our wrists. The peak of ubiquitous computing will be reached when services are made available irrespective of the object or platform being used (Sen, 2010). Presently we use computers to manipulate and access services in the world around us; ubiquitous computing will transform the world into a network of computers itself, enabling humans to carry out functions previously done with a desktop system without the need for any tools. While ubiquitous computing can be largely beneficial, it comes with certain challenges:

1. Impact on Privacy: Ubiquitous computing will require everyday objects to be embedded with computers that can procure, process and store sensitive user information and data. If access to this data is not strictly controlled, it will be immensely dangerous. For example, if a UC system is used by a medical team, then who sees the information exchanged, how it is exchanged and where it is stored are issues that need regulation and safeguards to prevent exposure. If such measures and safeguards are not baked into the systems from the beginning, user security will be put at risk.

2. Impact on Society: UC implementations themselves are not expected to have a negative social impact; much depends on how these technologies are deployed. For example, if more surveillance computers are set up in one community than in another, the result can be over-policing. The price of, and prerequisites for, the use of UC will also shape how these technologies impact society. Those unfamiliar with using and leveraging UC will be put at a disadvantage from an early stage, and communities that lack access to these disruptive technologies will also face repercussions for that lack of access.

3. Impact on Economy: UC is expected to benefit industry, specifically in the areas of production, logistics and commerce. It can help sellers target customers better and help buyers make better decisions, and it can automate certain processes to make production more resource-efficient. However, many workers may be replaced by automated computers, even outside the factory setting, and the implementation of UC can transform the workplace into an area of high surveillance, blurring the boundaries between work and private life.
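The privacy point above, that safeguards must be baked in from the start, can be sketched as a minimal deny-by-default access check for a hypothetical medical UC system. The roles, record fields and policy table are all illustrative assumptions, not any real standard.

```python
# Minimal role-based access control sketch for a hypothetical medical UC system.
# Roles, fields and the policy table are illustrative, not a real standard.
POLICY = {
    "doctor": {"vitals", "diagnosis", "medication"},
    "nurse": {"vitals", "medication"},
    "billing": {"insurance_id"},
}

def read_field(role, field, record):
    """Return a field only if the role's policy allows it; deny by default."""
    if field not in POLICY.get(role, set()):
        raise PermissionError(f"{role!r} may not read {field!r}")
    return record[field]

record = {
    "vitals": "BP 120/80",
    "diagnosis": "redacted",
    "medication": "redacted",
    "insurance_id": "X1",
}
print(read_field("nurse", "vitals", record))   # allowed by the nurse policy
# read_field("billing", "diagnosis", record)   # would raise PermissionError
```

The design choice worth noting is the default: an unknown role or field is refused rather than served, which is what "baked in from the beginning" means in practice.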

The goal of ubiquitous computing is to make technology as omnipotent and omnipresent as possible, and while this has the potential to improve productivity and ease of living, it also has the potential to infringe on privacy and negatively impact the lives of many individuals. If ubiquitous computing is to come into being, then certain safeguards and regulations are necessary.



The Problem with Omnipotence and Omnipresence

As mentioned above, humans have historically had the power to modify or alter the instructions they were handed; but given the current black-box nature of algorithms and of the institutions that deploy them, our power to assert control is limited (Paine, 2017). Further, the continuous development and incorporation of smart devices and technologies into our daily lives is reducing our autonomy over our own data and privacy, and there are pertinent questions about the accountability and responsibility of disruptive technologies as a whole. The fact of the matter is that the current legal infrastructure of India does not give the subjects of these algorithms (i.e., the users of the services) the power to challenge them, as they are mostly deployed by large corporate technology companies like Facebook, Google and Amazon. The pervasive nature of these algorithms in our daily life has begun influencing our decisions and interactions, and although true UC is still some way off, smart speakers and other Internet-of-Things devices are becoming more of a reality every day.

The importance of safeguards

The implications of unregulated, omnipotent and omnipresent technologies are as follows:

- A lack of regulated data standards can lead to the wide-scale spread of embedded biases, leading to loss of opportunity.
- A lack of regulated privacy standards can allow technologies to track, record and share personal details, leading to a loss of the ability to control one’s information and affecting other aspects of one’s life.
- The dominance of algorithm-based content suggestion can censor out opposing viewpoints or perspectives, leading to radicalisation, incomplete understanding of critical social issues and the perpetuation of socially harmful or disruptive activities.
  o As a corollary, the incorrect censorship of harmless content could be to the detriment of careers that depend on these technologies (influencers, freelancers, etc.).
- A lack of safeguards can result in incorrect predictions that will have disastrous results.
- The adoption of algorithms in public functions like police allocation and recidivism prediction in the United States of America has led to inequitable and prejudiced policing and jailing. Similar effects may be felt in India, which has comparable social divisions and prejudices.
- A lack of guidelines on accountability and responsibility can leave aggrieved parties harmed without recourse.

While disruptive technologies may unlock great potential for all individuals, these problems grow in severity and can be detrimental if not kept in check, especially alongside the novel problems and issues that have arisen with the development of UC and IoT technologies. The effects of a lack of regulation are already visible in India in the increasing division along political and religious lines, fuelled by the spread of fake, sensationalised and politically manipulated information, most notably on social media platforms operating in the country. We are reaching a stage of technological development where reliance on algorithms has reached a high point and is still rising. As many technology strategists have warned, technology may soon be controlling states rather than the other way around; something similar was even visible in the Arab Spring phenomenon of the 2010s (O'Donnell, 2011). In order to prevent this and keep the omnipotence and omnipresence of disruptive technologies in check, adopting a regulatory approach is a necessity.

Differentiating Neoliberal Materiality and Technological Evolution

Omnipresence and omnipotence are usually driven by the phenomenon of neoliberalism coupled with materialism: the advent of technologies like AI matters insofar as influence and dependencies can be exerted over data subjects, usually human consumers. The UNESCO draft recommendation on the ethics of artificial intelligence notes this phenomenon quite reasonably (UNESCO, 2021). Now, technological evolution itself has many parallels, meaning that the way evolution happens is different from how it is observed. Nevertheless, technological evolutions in various



countries show what kinds of interests and visions lie behind such transformations. Technological evolution and natural evolution are different, and must not be conflated. Now, the problem with a neoliberal approach to omnipotence and omnipresence is that it exploits human dignity and can also be held responsible for eroding the scope for technological distancing. Since the ontological and epistemic roots of neoliberal materiality are not kept in check, omnipresence and omnipotence, while reasonable conceptions in artificial intelligence ethics, demand a cautious approach in regulatory theory to handle the impact of their imposition. The reason is that, under a neoliberal approach to technology ethics, omnipresence and omnipotence can be considered imagined perceptions, which can possibly be weaponised to create conflict economies and polities. The examples of the Syrian civil war of 2011, the BlackBerry riots in London in the 2010s, and the situations in Afghanistan and other parts of South Asia where riots occur show how the factors of “presence” and “potential” can be abused.

Does this mean technological evolution must be prevented? No. There may be no perfect method of ensuring that such evolutions are not disruptive. Aesthetics matter, but the pragmatic impact needs to be assessed carefully, and that impact can be economic, social, individual, psychological, political and even legal. Risks exist, and studying them more effectively is therefore important.
