New and emerging technology, such as collaborative robots that can work alongside or even independently of people, looks set to boost productivity and innovation, but may also pose new risks to workers’ health, safety and wellbeing.
Over the past 250 years, we have had three industrial revolutions: the steam and water power of the 18th century, the electricity and mass production of the 19th and early 20th centuries, and the computerisation and automation that began in the 1970s. The first two revolutions lasted about 100 years each, and the third took about half that time.
We are now living in a fourth industrial revolution, known as Industry 4.0. The acceleration of technology has shaped every aspect of our lives and transformed our working environment, bringing smart automation, wearable tech, connectivity, artificial intelligence (AI) and robots. But just as we become familiar with Industry 4.0, the metamorphosis into Industry 5.0 has begun. The term Industry 5.0 refers to the acceleration of integrated and shared workspaces alongside cobots (collaborative robots) and smart machines. It builds on Industry 4.0 with enhanced connectivity and cyber-physical working environments.
Technology, AI and connectivity will come together and bring new occupational safety and health (OSH) challenges and opportunities into our working lives. In this article, we explore the changing landscape for the OSH practitioner and pose the question – are we ready for these changes?
The rise of the cobot
As an OSH practitioner working in a water processing and bottling plant 20 years ago, I remember with excitement the introduction of ‘Robbie the Robot’, who was installed to lift heavy five-gallon water bottles and carefully and precisely place them into stillage cages. Robbie was impressive, housed within his fixed guarding, with various infrared and interlocking devices.
As a practitioner, my main objective was to keep moving parts away from humans, which typically involved physical and procedural safeguards. But as we release these robots from their cages, are the traditional methodologies and approaches used for safeguarding fit for purpose?
As innovative technologies emerge, we will start to see a shift towards shared spaces and a closer human-machine interface, and the boundaries between humans and technology will become more blurred. Today, although many cobots are still relatively ‘fixed’ to a location and can only undertake predetermined actions, some can engage in direct interaction with a human worker in a shared space.
However, new automated guided vehicles (AGVs) can navigate through shared spaces and new autonomous mobile robots (AMRs) can shut themselves down in the presence of humans, releasing the cobots from their anchored abode. Cobots can now intelligently analyse and decide the correct task to perform; they are becoming more autonomous with the freedom to move on their own (using dexterity, sensing and memory capacities); and increasingly are able to reflect and ‘learn’ by means of algorithms that allow them to correct their ‘own’ behaviours and errors.
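The safety behaviour described above – a mobile robot slowing or stopping when a person comes near – can be sketched as a simple speed-selection rule. This is an illustrative sketch only; the zone distances and speeds are assumptions for the example, not values from any safety standard, and a real system would use certified sensors and controllers.

```python
# Hypothetical sketch of a cobot/AMR safety-stop rule: the closer a detected
# person is, the lower the permitted speed, down to a full stop.
# All distances and speeds below are illustrative assumptions.

def select_speed(human_distance_m: float) -> float:
    """Return a permitted speed (m/s) for a given human separation distance."""
    STOP_ZONE_M = 0.5      # inside this radius the robot must halt
    SLOW_ZONE_M = 1.5      # inside this radius it moves at reduced speed
    FULL_SPEED = 1.0       # normal operating speed (m/s)
    REDUCED_SPEED = 0.25   # collaborative, reduced speed (m/s)

    if human_distance_m < STOP_ZONE_M:
        return 0.0
    if human_distance_m < SLOW_ZONE_M:
        return REDUCED_SPEED
    return FULL_SPEED
```

The key design choice is that the rule fails towards safety: any reading inside the stop zone forces the speed to zero, regardless of the task in progress.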
Cobots are not just limited to assembly lines and packaging and palletising operations in manufacturing – a variety of industries now recognise the benefits of cobots for increased productivity, precision and return on investment. For example, within the construction industry, ‘Hadrian X’ – nicknamed for its wall-building ability – is the world’s first mobile, robotic concrete block-laying machine.
Hadrian X can lay 200 concrete blocks an hour, while monitoring and adjusting for wind, vibration and other environmental factors. No pre-laid track is required, allowing flexibility, and Hadrian X can build directly from CAD drawings. Large-scale 3D printing, painting drones, welding robots, self-driving vehicles, demolition robots and wearable tech have also started to appear in the construction industry.
Wearable tech and human–robot collaboration
You might be familiar with the everyday wearable tech that is commonplace for monitoring our health, fitness and wellbeing through tracking devices such as smart watches. But wearable tech can also report OSH data live – monitoring compliance, warning the wearer of hazards such as nearby moving vehicles, taking biomonitoring readings (such as heart rate), and tracking the wearer’s exposure to poor air quality, noise and heat.
For example, hard hats with integrated heat sensors can detect and warn of harmful solar rays and smart glasses can provide 3D mapping on a construction site to identify boundaries and hazard marking, alerting the wearer to the presence of dangers. Glove scanners enable workers to scan products on production lines or in warehouses hands-free, therefore reducing the risk of strain injury from repeated flexing of the hand when using a hand-held scanner.
In 1986, Ellen Ripley (played by Sigourney Weaver) in the film Aliens wore and used the Hollywood ‘Power Loader’ – the first robot exoskeleton to hit the silver screen. James Cameron described it as the ‘future forklift’, and he wasn’t far wrong. Today, exoskeletons are used not for fending off the xenomorph queen, but in a wide range of applications within industry.
In 2020, Hampshire care workers tested a Japanese cobot exoskeleton worn around their waists to assist with moving objects and supporting people. The cobot read signals from electrodes placed on the wearer’s skin that measured the person’s muscle movement and activity, and converted these into a cobot movement to help the user with the physical effort of the moving or lifting task. The care home bosses found that the task of caring for and physically supporting a resident with complex needs – one that might previously have required two carers working together – can in some instances be delivered effectively by a single individual using a cobot.
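The electrode-to-movement conversion described above can be sketched as a simple mapping from measured muscle activity to an assist force, capped at a safe maximum. This is a hypothetical illustration of the general idea – the gain, the cap and the normalised signal are all assumptions for the example, not details of the actual device.

```python
# Hypothetical sketch of EMG-driven assistance: a normalised muscle-activity
# signal (0-1) from skin electrodes is scaled into an assist torque, capped
# at a safe maximum. The gain and limit values are illustrative assumptions.

def assist_torque(emg_amplitude: float, gain: float = 20.0,
                  max_torque_nm: float = 30.0) -> float:
    """Map a normalised EMG amplitude (0-1) to an assist torque in newton-metres."""
    amplitude = max(0.0, min(emg_amplitude, 1.0))  # clamp out-of-range sensor noise
    return min(amplitude * gain, max_torque_nm)    # never exceed the safe cap
```

The clamping and the hard torque cap reflect the same principle as guarding a fixed robot: however noisy the human-side input, the machine-side output stays within safe bounds.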
With the care sector facing staff shortages, the prediction that we will all live longer, and rising rates of obesity in the UK (and the resulting physical demands on medical and care staff who need to lift and move heavy patients), cobots and cobot exoskeletons could provide major productivity and safety benefits for the health and care sectors. For example, the use of cobot exoskeletons to support workers during patient lifting and handling tasks could reduce the risk of musculoskeletal disorders among medical and care staff – a major cause of staff absence in the sector.
South Korean carmaker Hyundai has developed a Vest EXoskeleton (VEX) that enables the wearer to lift heavier weights over longer distances. When worn, the exoskeleton helps transfer the weight in the user’s arms away from their neck, back and shoulders to their core, and uses hydraulic, pneumatic and electric power to help balance the person’s movement. This enables the wearer to more easily lift and carry a heavy load.
Hyundai has also developed a chairless exoskeleton, a lower-body exoskeleton that supports the wearer when they go into a sitting position. This reduces the pressure on the spine and muscles from sitting for extended periods, therefore reducing the risk of lower back pain and general fatigue.
Typically, these examples are passive systems that support joint movement and increase human capacity, usually for carrying weights and moving loads. But in the medical profession, powered cobot exoskeleton systems are now being used in hospitals to support the rehabilitation of patients and these powered systems have a greater lifting and support capacity than passive exoskeletons.
For example, robotic powered exoskeletons have been used for the rehabilitation of people with weak or paralysed legs caused by stroke, spinal cord injury or other neurological conditions. The exoskeletons are placed over the legs to help with standing and walking, using battery-powered motors to drive the legs. As the user shifts their weight, sensors are activated that initiate steps. The exoskeleton helps people to re-learn step patterns and weight shifts, with the ultimate aim of helping them regain as much of their natural gait as possible.
Until now, the weight of the battery has slowed down the mobility of powered exosuits worn by people during mobile applications, but eventually powered exoskeletons will become more lightweight, cost-effective and versatile, furthering human–robot collaboration. In time, we are likely to see powered exosuits being more widely worn and used in industry, once the battery size and weight have been reduced.
As technology advances, so do AI and connectivity. The merging of these three disciplines will create a significant shift in the ability of cobots and the tasks they can perform. Currently, we think about robots in terms of AI and the input from humans. But we are likely to see cobots with AGI (artificial general intelligence) that are capable of understanding and learning intellectual tasks – and performing tasks independently without human direction.
An intellectual bridge between a digital world and humans
Web 3.0, metaverse, virtual reality (VR), augmented reality (AR) and immersive types of technology – originally developed for use in the military and gaming worlds – are growing to allow people to live an increasingly digital existence. There are approximately 160 companies building the virtual world, Web 3.0, that are currently all separate, but will eventually link together. Think of it as lots of separate rooms that will eventually all become one building with connecting doors.
Many tech companies, including Microsoft and Facebook, are already investing. Global spending on VR/AR, the metaverse’s foundation technologies, is expected to rise from $12 billion in 2020 to $72.8 billion in 2024. It is expected the metaverse will be a significant part of our everyday lives: Generation Z and Millennials are expected to spend close to five hours per day in the metaverse, but older generations are also expected to spend several hours each day in the metaverse.
The gaming industry is clearly at the forefront of the metaverse, so it is no surprise that we associate discussions of the metaverse with gaming. Although gaming remains a leading reason for accessing the metaverse (and using VR and AR), consumers are increasingly looking for entertainment and shopping in the virtual world. You can buy and sell digital items, and many brands like Nike are already trading virtual items, like trainers for avatars. One in five metaverse users has attended virtual live events such as concerts, film festivals and training/learning experiences, allowing digital interaction and networking with humans, as if they are in a face-to-face setting.
Since Covid, many employers have viewed hybrid working as a way of supporting workforce wellbeing. Some tech companies are building virtual offices in the metaverse to support limitless design and functionality for their employees, to allow their colleagues to build their own preferred virtual desks and environment. Although I feel a degree of scepticism, I remind myself it has been two-and-a-half years since I last worked in a physical office.
Decisions on the scale and extent of adoption, opportunity, the investment required and how the metaverse will shape our working lives will be driven by employees and consumers. As a Generation X’er, I think it will be the younger generation of Gen Z and Millennials who will shape this evolution.
Currently, the interconnectivity within Web 3.0 is still in its infancy. The option to wear VR headsets for a 40-hour working week in the metaverse and see pixelated characters in a 2D world is not all that appealing. But eventually, the option to work in immersive, connected environments will create global workspaces.
The fourth generation of the Internet, known as Web 4.0, is a term used to describe the new phase of the web where users can interact more seamlessly. Instead of individual users working in isolation, we will shift towards users working together. Another critical concept of Web 4.0 is the ‘Internet of Things’ (technology that can connect and exchange data with other devices and systems over the internet), and the number of devices connected to the internet is growing.
Meanwhile, the new emerging ‘Internet of Behaviours’ (IoB) can understand and predict human behaviour through data collection and analysis.
For example, advertisers are intelligently tracking our internet activities and providing ads that match our interests; and entertainment providers, such as Spotify, can predict our likes and suggest personalised playlists. This technology is already transferring into industry, using algorithmically (or computer) generated content.
For instance, instead of e-learning course designers developing content, or trainers creating pre-set tasks and assignments, computers can generate content or tasks based on students’ performance and progress. Computers are fed with knowledge about the subject matter and given instructions on how to combine it to generate new material. Known as adaptive learning algorithms, these techniques will also enable students to be the architects of their own learning and delivery methods.
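At its simplest, the adaptive part of such an algorithm is a rule that adjusts the next task’s difficulty from the learner’s recent results. The sketch below is illustrative only – the thresholds, the five-level scale and the scoring scheme are assumptions for the example, not a description of any particular e-learning product.

```python
# Illustrative sketch of an adaptive learning rule: choose the next task's
# difficulty level (1-5) from the learner's recent scores (each 0.0-1.0).
# Thresholds and the level scale are assumptions for this example.

def next_difficulty(recent_scores: list, current_level: int) -> int:
    """Raise, hold or lower the difficulty level based on recent performance."""
    if not recent_scores:
        return current_level                 # no data yet: stay put
    success_rate = sum(recent_scores) / len(recent_scores)
    if success_rate > 0.8:                   # mastering the material: step up
        return min(current_level + 1, 5)
    if success_rate < 0.5:                   # struggling: step down
        return max(current_level - 1, 1)
    return current_level                     # in the productive zone: hold
```

Real adaptive systems are far more elaborate (modelling individual skills rather than a single level), but they follow the same loop: measure performance, update a learner model, select the next item.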
Predictive modelling and data mining – analysing current and historical data – can be used to make predictions about future events. For example, when using live sensors to measure the carbon dioxide (CO2) levels a worker is exposed to, would an increase in CO2 readings result in reduced cognitive function, leading to a mistake? Academic research suggests that 1,000 ppm (parts per million) of carbon dioxide in the atmosphere reduces the ability to concentrate by approximately 30 per cent. This data could be vital if an individual is making critical decisions.
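A live-sensor check of this kind can be sketched as a small monitor that smooths recent readings and flags when they breach the 1,000 ppm figure cited above. The smoothing window is an assumption for the example; a real deployment would calibrate sensors and choose thresholds against occupational exposure guidance.

```python
# Hedged sketch of a live CO2 alert using the 1,000 ppm figure cited in the
# text. The rolling-average window is an illustrative assumption, used to
# avoid alerting on a single noisy sensor reading.

from collections import deque

class CO2Monitor:
    CONCENTRATION_LIMIT_PPM = 1000.0  # level linked to reduced concentration

    def __init__(self, window: int = 5):
        self.readings = deque(maxlen=window)  # keep only the last few readings

    def add_reading(self, ppm: float) -> bool:
        """Record a sensor reading; return True if the smoothed level breaches the limit."""
        self.readings.append(ppm)
        average = sum(self.readings) / len(self.readings)
        return average >= self.CONCENTRATION_LIMIT_PPM
```

Averaging over a short window is the design choice that matters here: it trades a few seconds of alert latency for far fewer false alarms from transient spikes.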
Consider applying these algorithms and AI to the physical technology of machines. As we look into the future, cobots will be able to learn tasks and activities in an autonomous state (and retrain themselves on the task as it evolves), making intelligent decisions based on learnt algorithms. The knowledge of cobots will constantly evolve, and they will be able to move freely in a shared workspace with humans.
The UK’s Health and Safety Executive (HSE) has published research on the health and safety risks of human and robot interaction, but there is no specific guidance on managing those risks. Such guidance would be difficult to define explicitly, as technology is moving at such a fast pace. UK health and safety law does require employers to take any reasonably practicable measures that will keep their employees safe at work. This means following the typical approach of risk assessments, safe operating procedures, training and information, and suitable processes for breakdown, maintenance and cleaning.
When considering the use of cobots, we may also need to consider power, force and speed limits, fail-safe systems, emergency stop systems, ergonomic design and, of course, the usual physical (collision, substances, noise, heat, radiation), environmental and ergonomic hazards. Engineering and human errors (such as faulty electronics), or errors in programming and in interfacing peripheral equipment, could also cause injuries to employees working with and around cobots.
Numerous literature studies highlight gaps and uncertainties in existing safety assessment methods due to the complexity and versatility of cobots. Methodological research is still primarily focused on traditional risk assessment and physical safety, which is a major issue for cobot producers and users.
Technology will also lead to new workspaces and new ways of working, which may generate new risks. For example, it is well recognised that the increase in remote working has amplified feelings of isolation and mental health concerns among individuals. OSH practitioners have played a prominent role in promoting the dialogue around wellbeing in organisations. But could the cobots of the future be active collaborators that also promote workers’ mental health and wellbeing?
This does raise a question about the use of cobots in industry and the need to upskill OSH practitioners so they can understand the health and safety (and indeed psychological) implications. What are the challenges and risks around the interface between cobots and humans? And are we prepared for the pace of technological advancement?
Although cobots currently represent only three per cent of all industrial robots sold, they are projected to account for 34 per cent of the industrial robots sold by 2025, a market that itself is set to triple in size. Therefore, the skills of future workers will change. This means not just wider use of IT and engineering skills, but a growing demand for coding and data science, as new technologies like cobots emerge.
Therefore, as Industry 5.0 and Web 4.0 collide, driving rapid and constant transformation of digital infrastructures, we must consider not only the economic and social changes but also the OSH aspects. There will have to be proactive risk assessments at the design stage of early innovations, and early conversations between technology and digital developers and OSH professionals, so we can understand the health and safety risks and other challenges that workers may face from these new technological and digital developments, and decide how best to manage them.
The interactions between cobotics, AI and connectivity could be one of the most transformative factors in the modern world. We’re on the cusp of the next era of human–machine partnerships. As future knowledge and technology development accelerate and enter into our workspace, a platform for horizon scanning the likely associated OSH risks – identifying the potential mitigation and risk controls of the future and recognising how innovations can support the OSH agenda – will be critical for drawing up proactive and effective preventive and protective measures.
The future poses an exciting opportunity for OSH practitioners to embrace the transformation, leverage technology and reshape the scope of OSH through the Industry 5.0 revolution.
But before you finish reading this article, Industry 6.0 is already beginning to be defined, driven by the global economic crisis and its impact on supply chains and suppliers. A resilience-building Industry 6.0 – reframing customer-centric, hyperconnected and dynamic delivery – is being discussed as a way to provide sustainable solutions and growth to the UK and the global economy. Even for those of us still getting to grips with the present Industry 4.0, the future will be upon us with lightning speed, and the horizon of 6.0 is already within sight.
OSH practitioners have an important role to play in this vibrant future, leading the global OSH conversation while supporting technological progress. Given the speed and scope of tech growth, as a profession we should be investing in research, sharing knowledge with peers and upskilling, so that we can contribute to a world-leading transformation and vision.
Dr Julie Riggs will be speaking at the SHW Live Exhibition in Manchester on 14–15 February on ‘The changing landscape of OSH in a digital world, are we ready?’ and ‘Creating a suitable safety culture’. See: safetyhealthwellbeing.live
Dr Julie Riggs is senior head of education at the British Safety Council
Contact Dr Julie Riggs: [email protected]