
The Ethics of Automation in Engineering: 5 Issues to Consider

on July 8, 2025 in Engineering Clients


Author: Robert Shearer

Modern engineering is transforming—and fast. Recent advancements in robotics, AI, Industrial Internet of Things (IIoT), drones, advanced software, predictive analytics, simulations, and digital twins are pushing modern engineers to become hybrid experts: not only in hardware, but in the systems used to automate, streamline, and accelerate development cycles.

But every transformation has both pros and cons. Sure, a 50% reduction in downtime and 20% lift in OEE (Schneider Electric) is great, but is it worth displacing jobs, losing human skills, and compromising on safety and privacy? Before going “full steam ahead,” consider the ethical shortfalls of automation and how to guard against them.

The rise of automation in engineering: key trends

Control Engineering’s State of Industrial Automation 2025 Report states that 60% of engineers use automation systems today. This widespread adoption is driving fundamental changes across many aspects of engineering, including the following.

Software-hardware convergence

Most engineering systems now include automated components, requiring expertise in the programming languages and software tools that operate them. For example, modern PLC, HMI, and SCADA tools leverage graphical programming environments, simulation tools, and libraries for rapid development and testing. These features enable engineers to design, validate, and deploy systems faster, but they also demand a diverse array of skill sets on top of traditional core engineering skills.
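To make the convergence concrete, here is a minimal sketch of the classic start/stop motor latch that a PLC programmer would express in ladder logic, written instead as a Python scan-cycle loop. The input names and scenario values are illustrative assumptions, not any vendor's API; the point is that the same seal-in logic engineers once wired by hand is now software.

```python
# Hedged sketch: a PLC-style start/stop "seal-in" latch as a scan cycle.
# Input names and values are illustrative assumptions, not a vendor API.

def scan(inputs, motor_running):
    """One PLC scan: read inputs, evaluate logic, return new output state."""
    start, stop, overload = inputs["start"], inputs["stop"], inputs["overload"]
    # Seal-in logic: motor runs if (start pressed OR already running)
    # AND stop is not pressed AND no overload trip.
    return (start or motor_running) and not stop and not overload

state = False
for cycle_inputs in [
    {"start": True,  "stop": False, "overload": False},  # operator presses start
    {"start": False, "stop": False, "overload": False},  # latch holds the motor on
    {"start": False, "stop": True,  "overload": False},  # stop pressed, motor drops out
]:
    state = scan(cycle_inputs, state)
    print(state)  # True, True, False
```

The graphical environments mentioned above let engineers draw, simulate, and validate exactly this kind of logic before it ever touches hardware, which is where the speed gains come from.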

Robot-human collaboration

In industrial robotics, the number of proprietary robot programming languages (ABB RAPID, Fanuc TP, KUKA KRL, Yaskawa INFORM) has grown significantly. What’s more, robots now dynamically leverage sensors, safety zones, and autonomous operating systems to collaborate with human engineers and augment their capabilities. While some systems fully automate human labor, in most cases the result is a collaborative model rather than a competitive one.

Self-optimization and operational efficiency

Advances in mechatronics and motion control have produced smarter, more connected, and more adaptive systems. Plug-and-produce integration, edge computing, Industrial Internet of Things (IIoT), real-time machine-level processing, and simulations all work together to improve efficiency. For example, modern servo systems, linear actuators, and stepper motors use built-in, real-time feedback mechanisms (often powered by AI) to self-optimize for more precise positioning and adaptive motion, reducing integration time and increasing flexibility.
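As an illustration of the kind of real-time feedback loop these servo systems rely on, here is a minimal sketch of a PID (proportional-integral-derivative) position controller in Python. The gains, setpoint, and simple plant model are illustrative assumptions chosen for the demo, not taken from any specific product.

```python
# Hedged sketch of a closed-loop PID position controller, the kind of
# feedback mechanism built into modern servo drives. Gains, setpoint,
# and the plant model are illustrative assumptions.

def pid_step(error, prev_error, integral, dt, kp=4.0, ki=0.5, kd=4.0):
    """One PID update: returns (control output, updated integral term)."""
    integral += error * dt
    derivative = (error - prev_error) / dt
    output = kp * error + ki * integral + kd * derivative
    return output, integral

def simulate(setpoint=1.0, steps=500, dt=0.01):
    """Drive a simple double-integrator plant toward the target position."""
    position, velocity = 0.0, 0.0
    integral, prev_error = 0.0, setpoint  # initial error = setpoint - 0
    for _ in range(steps):
        error = setpoint - position
        control, integral = pid_step(error, prev_error, integral, dt)
        prev_error = error
        velocity += control * dt      # control output acts as acceleration
        position += velocity * dt
    return position

print(f"position after 5 s: {simulate():.3f}")  # converges near 1.0
```

A real servo runs this loop thousands of times per second on embedded hardware, and the "self-optimizing" systems described above go a step further by tuning the gains automatically.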

Scalable workflows

Emerging tech like IIoT, edge computing, RPA, MES/MOM, digital twins, and digital manufacturing tools don’t operate in a vacuum. Modern engineers must not only master these tools, but also integrate them into engineering workflows while maintaining project deadlines, safety, documentation, and scalable production.

5 ethical issues to consider around automation in engineering

These transformations in engineering are creating new value and benefits, but they also raise new ethical considerations. Here are some of the most relevant issues to consider.

1. Job displacement

As machines take on more tasks traditionally performed by human workers, accelerate development cycles, and become better at self-optimization, the demand for human labor decreases. Not only can this cause financial hardship and reduced self-esteem for those displaced, it could also disincentivize people from entering the field in the future.

However, it’s important to temper this concern. Right now, 77% of employers struggle to hire qualified engineers. While automation will eat into some demand for human labor, the ongoing shortage of qualified engineers means that widespread displacement or replacement is unlikely, at least for now.

2. Social responsibility and fair transition

Responsible organizations should consider their own ethical obligations to support workers affected by automation. This includes retraining, upskilling, transition assistance, and other programs that redirect worker value rather than eliminate it.

One approach is to shift workers from human-only tasks to collaborative work with machines. It will be a while before machines can operate fully autonomously (if ever), which means workers who keep up with these evolving systems can continue to offer value amid continuous change.

3. Skill loss and brain drain

Automation fatigue is a real challenge in all sectors, including engineering. When humans become overly reliant on automation to make decisions, it can lead not only to safety and security concerns (more on that below), but also to skill loss and brain drain.

It’s basically “use it or lose it,” applied to creativity, problem solving, and engineering judgment. If engineers lose critical experience with manual calibration, system behavior analysis, and mechanical intuition, your organization’s institutional knowledge will start to erode.

4. Safety and compliance

Faster isn’t always better. Yes, accelerated development cycles can deliver significant business value, but not if they come at the expense of safety compliance or clear accountability for failures.

It’s important to maintain rigorous testing, validation, and safety verification processes, even as development and deployment speed up. Additionally, as systems become more complex and autonomous, technical documentation will be even more critical to maintain compliance, troubleshoot issues, and determine accountability for mistakes.

5. Privacy, surveillance, and control

Automated systems, especially those powered by AI and IIoT, collect vast amounts of data. Collaborative systems continuously monitor and analyze human performance, creating unprecedented capabilities in workplace surveillance.

This presents a twofold risk. First, there’s the question of what we should surveil. Monitoring every aspect of a worker’s day can strain the relationship between employees and leadership; no one wants Big Brother watching all the time. Second, the sheer volume of data leaves everyone drowning in information, so key insights (e.g., small issues that could balloon into catastrophic failures) can get lost in the noise.

Final thoughts on the ethics of automation in engineering

Engineering companies stand to benefit from the rise of automation and AI in engineering: faster production, less downtime, and more efficient risk mitigation. But the impact on human employees, unclear accountability, and safety risks should give leaders pause before going “all in” on these tools.

Building a comprehensive talent strategy in 2025 and beyond means strategically evaluating your needs and determining which tasks are better handled by humans, which by AI and automated tools, and which require collaboration between the two.

That type of assessment is easier when you have a strategic talent partner who offers tailored integration programs based on your specific needs, and smart workforce planning driven by both internal and external analytics.

If you want an efficient—and ethical—approach to automation, schedule an intro call with one of PEAK’s staffing specialists today.