

Humans have developed our civilizations over millennia through bipedalism, language, cooperation and the use of technologies. Not long ago, archaeologists discovered that 1.8 million years ago Homo erectus in Dmanisi, Georgia was still using simple chopping tools, while in West Turkana, Kenya, according to the study, the population had already developed hand axes, picks and other innovative tools that anthropologists call “Acheulian” (1).

One characteristic of most technological innovations is that they not only benefit humans but also pose certain dangers. The knife that served to kill hunted animals and build huts could also be used to kill humans. The introduction of electricity brought many deaths by electrocution, just as the use of cars caused, and still causes, its own mortality. Not to mention atomic energy.

The big change with new technologies in the 21st century is their speed of development and implementation. The digital world pervades most human activities. The coronavirus pandemic is demonstrating the importance of these technologies, especially Artificial Intelligence, without which it would not have been possible to obtain vaccines in barely 10 months. Without smartphones and applications developed in record time, it would not have been possible to test, track and trace patients with symptoms or possible contagion. With tools like telemedicine, millions of patients have been seen remotely. We could say that the implementation of digital health technologies has compressed into a single year what might otherwise have taken about 10.

The changes have occurred not only in health but also in other sectors, such as education, where online classes have been vital to continue teaching, or retail, where millions of people are shopping online. Telework has allowed many companies to continue operating, and virtual conferences, previously used only exceptionally, have become absolutely necessary.


We could cite many more examples, but what needs to be seen is that we are in a human paradigm shift, perhaps the most important change in human history, for which we are all ill-prepared. Our brain is practically the same one that existed thousands of years ago. Research from the Max Planck Institute for Evolutionary Anthropology, Germany, reveals that modern humans have a distinct brain and skull architecture which likely fully developed around 40,000 years ago.

In those 40,000 years, as far as anthropologists can tell, human activity was very basic until a few centuries ago: fundamentally, surviving, hunting, gathering fruit from trees, fighting wild animals and fending off attacks from enemy tribes. The progressive development of agriculture and herding, which gradually moved humans from appropriating nature to producing from it, is known as the agricultural revolution.

This process began approximately 10,000 years ago among the peoples of Mesopotamia and Egypt, and was later repeated in India (6000 BC), China (5000 BC), Europe (4500 BC), Africa (3000 BC) and America (2500 BC).

But the real agricultural revolution was the profound transformation that agriculture and livestock farming underwent from the 18th to the 19th century. It happened in Europe, mainly in Britain.


We have a problem. Our brain is not programmed for change; its goal is still simply to survive. The really significant changes have occurred over the past few centuries, that is, over 1.25% of those 40,000 years. That is little time for our brain to evolve as an organ. Yet one of the main evolutionary adaptations of humans compared with other species is the capability of our brain, and particularly of the cerebral cortex. For example, the average human brain has an estimated 85–100 billion neurons (2) and contains many more glial cells, which support and protect the neurons. Each neuron may be connected to up to 10,000–12,500 other neurons, passing signals via as many as 100 trillion synaptic connections, equivalent by some estimates to a computer with a 1-trillion-bit-per-second processor (3).
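A quick back-of-envelope check shows how these numbers fit together. Note the assumption: the 10,000–12,500 figure in the text is a per-neuron maximum, so an assumed average of roughly 1,000 synapses per neuron is what recovers the "100 trillion" order of magnitude.

```python
# Rough order-of-magnitude check of the synapse estimate cited above.
# Assumed values, not measurements.
neurons = 86e9                  # estimated neuron count in a human brain
avg_synapses_per_neuron = 1e3   # assumed average, well below the 10,000+ per-neuron maximum

total_synapses = neurons * avg_synapses_per_neuron
print(f"{total_synapses:.1e}")  # 8.6e+13, the same order of magnitude as "100 trillion" (1e14)
```

Using the 10,000-per-neuron maximum instead would give around 10^15, so the commonly cited 100 trillion sits toward the lower end of these estimates.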

The blue brain shows the globular shape of present-day human heads. In contrast, the red skull shape of a Neandertal, like the earliest Homo sapiens fossils, is elongated.

Image credits Simon Neubauer, Jean-Jacques Hublin, Philipp Gunz / MPI EVA Leipzig.


Let’s go back to the world of technological innovation. Experts talk about the four technological revolutions since 1784 that have shaped our world. Industry 4.0 is where we are now.

Industry 4.0 has been defined as “a name for the current trend of automation and data exchange in manufacturing technologies, including cyber-physical systems, the Internet of Things, cloud computing and cognitive computing, and creating the smart factory”. One of the challenges is how to move from Cyber-Physical Systems (CPS) to Cyber-Human Systems (CHS) (4).

Humans are a vital element; however, skilled production personnel have largely been relegated to data receivers in the Cyber-Physical Systems (CPS) of Industry 4.0. A renewed focus, through Cyber-Human Systems (CHS), on the human worker who completes significant portions of manual value-added work in an organization allows humans to perform their jobs more safely and efficiently, and supports enhanced control and quality monitoring of manual manufacturing tasks. A unified, complementary framework of CHS and CPS is needed to guide the implementation of future smart systems (5).

Source: A complementary Cyber-Human Systems framework for Industry 4.0 Cyber-Physical Systems (5). © 2017 The Authors. Published by Elsevier B.V. Peer review under responsibility of the Scientific Committee of NAMRI/SME.

In fact, Industry 4.0 is a big step forward in process automation, taking advantage of technologies already on the market (Artificial Intelligence, Machine Learning, Blockchain, Big Data analytics, Robotics, etc.) in manufacturing, services and retail, as well as in sectors such as education and health. In principle, this automation focuses on the advanced technologies themselves, while the human question stays in the background. In a way, and logically, the emergence of 4.0 forces us to adapt to the technologies: a progressive adaptation, but one lacking educational strategies. (6)

Source: A complementary Cyber-Human Systems framework for Industry 4.0 Cyber-Physical Systems. Uploaded by Laine Mears.


We know a lot about technologies, but we know little about humans, as individuals and as a society. Just as we know more about the space surrounding the Earth than about the bottom of the sea and its inhabitants: it is more profitable to invest in exploring space than in exploring the seabed.

The next figure is an interesting proposal, based on deep reflection by industry experts, on the likely evolution of 4.0: selecting the optimal operating procedure for implementing smart systems, and for maintenance with the best possible human-machine interactions. It integrates the technologies, the relationship between man and machine, and the need for new skills and talents.

Source: Smart Society and Artificial Intelligence: Big Data Scheduling and the Global Standard Method Applied to Smart Maintenance, Engineering. (7)


The debate and research around the human relationship with machines has only just begun. The truth is that the future of humanity is linked to technologies and, unlike in previous waves of innovation, these are technologies that in a certain sense are “alive”: they improve and transform themselves, as Artificial Intelligence and robotics do. The challenge is not the machines or the technologies themselves; the challenge is how to acquire the skills and abilities to operate them. But above all, as citizens and as a society, how do we address the changes and the impact they will generate on our civilization? We talk about it in the next article…


  1. Lepre, C., Roche, H., Kent, D. et al. An earlier origin for the Acheulian. Nature 477, 82–85 (2011).
  2. Number of Neurons in a Human Brain (accessed on 11 December 2018).
  3. The Human Memory (accessed on 15 January 2019).
  5. Krugh, Matthew & Mears, Laine (2018). A complementary Cyber-Human Systems framework for Industry 4.0 Cyber-Physical Systems. Manufacturing Letters 15. doi:10.1016/j.mfglet.2018.01.003.
  6. (PDF) The Process of Evolution, Human Enhancement Technology, and Cyborgs (accessed 14 January 2021).
  7. Foresti, Ruben; Rossi, Stefano; Magnani, Matteo; Guarino Lo Bianco, Corrado; Delmonte, Nicola. Smart Society and Artificial Intelligence: Big Data Scheduling and the Global Standard Method Applied to Smart Maintenance. Engineering.

Collaboration at the conceptual level involves:

Awareness – We become part of a working entity with a shared purpose

Motivation – We drive to gain consensus in problem-solving or development

Self-synchronization – We decide as individuals when things need to happen

Participation – We participate in collaboration and we expect others to participate

Mediation – We negotiate and collaborate to find a middle ground

Reciprocity – We share and we expect sharing in return through reciprocity

Reflection – We think and we consider alternatives

Engagement – We proactively engage rather than wait and see

Collaboration life-cycle – source AIIM

Digital deficits: People’s cognitive capabilities will be challenged in multiple ways, including their capacity for analytical thinking, memory, focus, creativity, reflection and mental resilience

A number of respondents said people’s cognitive capabilities seem to be undergoing changes detrimental to human performance. Because these deficits are found most commonly among those who live a highly digital life, they are being attributed to near-constant connectivity online.

The changing world of work

New modes of learning

The future belongs to the motivated

Skill sets for the future

1. Ideation: The ability to come up with new ideas about a topic. Here, the number of ideas matters, not their quality or creativity.

2. Decision-making: The ability to understand the cost versus benefits of a potential idea or action and choosing the most appropriate one.

3. Originality: The ability to come up with unique ideas on a given topic to creatively solve a problem.

4. Active learning: Using learning principles/instructional methods to come up with procedures to teach new things.

5. Systems evaluation: The ability to identify indicators of system performance and the actions needed to improve the performance relative to the goals of the system.

6. Learning strategies: The ability to understand the implications of new information for current and future problem-solving and decision-making.

7. Complex problem-solving: Identifying complex problems and reviewing related information to develop and evaluate options and implement solutions.

8. Critical thinking: Using logic and reasoning to identify the strengths and weaknesses of alternative solutions, conclusions or approaches to problems.

9. Systems analysis: Determining how a system should work and how changes in conditions, operations, and the environment will affect outcomes.

10. Deductive reasoning: The ability to apply general rules to specific problems to produce answers that make sense.

