Throughout history, technological revolutions have been pivotal moments in shaping societies, economies and cultures. From the advent of the wheel to the integration of digital systems, humanity has continually sought to enhance its existence through innovation. Yet, understandably, these pivotal changes have not been devoid of apprehension, fear and hesitation regarding how they will affect society.
With every new advancement, we have faced uncertainty about the opportunities it presents and the threats it might pose. Today, we stand at the precipice of another such revolution: the era of Artificial Intelligence (AI).
This first instalment in our three-part AI blog series explores the technological milestones that have fundamentally redefined human society. Each era has sparked concerns, disrupted livelihoods and changed how we perceive the world. Yet, with each new era, humanity has adapted and thrived. The rise of AI can be seen not as an unprecedented leap into the unknown but as a continuation of this long-standing tradition of progress.
The Industrial Revolution began in the late 18th century and was a turning point in human history. With the invention of the steam engine, mechanised production processes and the establishment of factory systems, human society experienced a radical shift in its approach to labour and manufacturing. This revolution laid the foundation for modern economies, enabling mass production on a scale never before imagined.
However, the rise of machinery had a significant social impact beyond economics. Traditional artisanal work, such as weaving, pottery and blacksmithing, declined as mechanised processes replaced manual craftsmanship, and large portions of the population moved from rural areas to urban centres in search of factory jobs. Moreover, as factories prioritised heavy manual labour, the workforce became predominantly male, driving many women out of paid work and back into the home. Factories were mainly populated by men and children, who were considered cheaper sources of labour.
However, the Industrial Revolution's legacy is not solely one of lost jobs but also one of economic growth, increased productivity and the rise of new industries. Societies eventually adapted by creating new types of employment and new social norms centred around this mechanised reality.
As the Industrial Revolution gave way to the 20th century, we entered what can be referred to as the Analog Era—a period defined by the widespread adoption of telecommunication technologies like the telephone, radio and, eventually, television. These inventions transformed how people communicated and consumed information, connecting the world.
The telephone, for example, redefined personal and business communication, collapsing distance barriers. Radio introduced mass communication, enabling the real-time dissemination of news, culture and entertainment to audiences everywhere. Television followed, combining visual and auditory experiences to bring shared cultural narratives directly into people’s homes. This era also marked a shift in workforce dynamics, allowing women to re-enter the workforce in roles such as secretaries, typists and switchboard operators. The expansion of telecommunications created administrative roles accessible to women, contributing to their increasing participation in the workforce.
Yet these changes, like those before them, were met with both excitement and anxiety. There were fears of media and political influence, the breakdown of the traditional family and cultural homogenisation. But this era of connectivity laid the groundwork for what would come next: the Digital Revolution.
The emergence of the Internet and the subsequent rise of digital technologies represented a revolution that few could have predicted. With the advent of the World Wide Web in the 1990s and the proliferation of personal computers, smartphones and Wi-Fi networks, the Internet fundamentally transformed how we live, work and connect.
Information flowed freely across borders for the first time, giving rise to a global knowledge-based economy. Online platforms redefined commerce, media and social interaction, creating entirely new industries while disrupting established ones. The Internet’s democratising effect also revolutionised education, making knowledge more accessible. Additionally, the Internet Era enabled a shift toward remote work, making it possible for people, particularly women with young children, to balance their responsibilities while remaining in the workforce. This shift allowed a flexibility that previous eras couldn't accommodate.
However, the digital revolution's influence was not uniform globally. Developed countries experienced rapid technological growth and digital transformation, while developing nations faced challenges due to limited infrastructure and access to digital tools. This disparity highlighted a growing digital divide that widened inequalities in economic opportunities, education and information accessibility.
Despite the opportunities, the digital era brought about new uncertainties. Concerns about data privacy, cybersecurity and the digital divide highlighted the potential pitfalls of this newfound connectivity. Moreover, as automated processes and digital platforms replaced traditional jobs (for example, supermarket self-service checkouts replacing cashiers), fears of widespread job displacement resurfaced.
The emergence of AI is not a sudden disruption but a logical next step in the continuum of technological progress. AI is already embedded in our lives in ways we often overlook, from voice-activated home assistants to recommendation algorithms shaping our online experiences. We rely on machine learning systems for everything from navigating traffic to managing logistics and optimising marketing campaigns. In many ways, AI is already an extension of the digital era’s promise to enhance efficiency and deliver better outcomes.
However, the anxiety surrounding AI is unique in that it challenges fundamental concepts of human autonomy and agency. Unlike previous technologies, which expanded human capacity, AI systems can learn, adapt and make autonomous decisions. This raises new ethical questions: What jobs might AI replace? Could autonomous systems surpass human intelligence? What might happen if AI decision-making becomes opaque or biased?
Yet, we must remember that similar fears have accompanied every major technological shift. Just as the steam engine did not render human labour obsolete and the internet did not destroy all traditional industries, AI is not a harbinger of human irrelevance. Rather, it presents an opportunity to rethink how we work and live, challenging us to redefine our roles in a technologically augmented society.
Technological revolutions have consistently reshaped how we live and work, from the mechanisation of labour to the advent of global connectivity. With each new era, society has faced deep uncertainties and challenges. Yet, history has shown that while disruptions are inevitable, so are adaptation and progress.
As we now confront the rise of AI, it is essential to understand that our current fears are not new. They are a natural response to profound change. However, the threats posed by AI are real and must be acknowledged. For example, AI systems are growing in capability and can perform tasks that previously required human labour. We see this in automated systems managing warehouses and AI-powered customer support. As AI continues to develop, there is a risk of certain jobs becoming redundant, which will require thoughtful adaptation.
In the second part of this series, we will explore AI’s current impact on marketing, work and life, focusing on the balance between automation and human agency. By examining this balance, we aim to better understand AI's role in shaping the future.