Digital Twin and Simulation: Replicating Industrial Systems to Enable Improvement

Digital models, digital shadows, and digital twins help transform and optimize industrial and business operations.

Digital twins are currently being used in several industries and organizations to design and operate complex products and processes. This technology adoption is massively accelerating process development and optimizing operations, and its success comes from enabling enhanced decision-making. In terms of business outcomes, the main results companies pursue through digital twins are increased efficiency, reduced costs, better product design, and boosted innovation.

Before we get to the applications, let’s look at some definitions. The International Organization for Standardization (ISO) has developed a digital twin framework for manufacturing where the concept of digital twins is defined as: “A fit-for-purpose digital representation of an observable manufacturing element with a means to enable convergence between the element and its digital representation at an appropriate rate of synchronization.”

This means that a digital twin is a virtual replica of a physical component, product, system, or process within a manufacturing setting. Its function is to reflect the state or performance of the physical object in real time, to enable further capabilities such as simulation, orchestration, and prediction, and ultimately to support operational decision-making.

The digital twin concept comprises three main elements: the physical product, the digital product, and data connecting the two. Depending on the level of data integration, digital twins can be categorized into three subcategories (figure 1):

  • A digital model is a digital representation of a physical object without any automated data exchange between the physical object and the digital object. This means that a change in the physical object is not reflected automatically into the digital object and vice versa.
  • A digital shadow is a digital representation where an automated one-way data exchange exists, which means that a change in the physical object is reflected automatically into the digital object, but not vice versa.
  • A digital twin is a digital representation where data is fully integrated and flows automatically in both directions between the physical object and the digital object.
Figure 1. Each type of digital twin involves a different relationship between the physical and digital worlds, and different data flows. Source: Jeff Winter
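
To make the distinction concrete, here is a minimal Python sketch in which the three integration levels differ only in which direction state changes propagate automatically. All class and attribute names are illustrative, not taken from any standard; a real implementation would sit on top of historians, IoT platforms, and control systems.

# Minimal, illustrative sketch of the three integration levels.
# All names are hypothetical.

class DigitalModel:
    """No automated exchange: updates in both directions are manual."""
    def __init__(self):
        self.physical_state = {}
        self.digital_state = {}

    def manual_update_digital(self, snapshot):
        self.digital_state.update(snapshot)     # an engineer enters data by hand

    def manual_update_physical(self, changes):
        self.physical_state.update(changes)     # changes applied manually on the shop floor


class DigitalShadow(DigitalModel):
    """One-way automated flow: physical to digital."""
    def on_sensor_reading(self, reading):
        self.physical_state.update(reading)
        self.digital_state.update(reading)      # mirrored automatically


class DigitalTwin(DigitalShadow):
    """Two-way automated flow: digital decisions act back on the asset."""
    def push_setpoint(self, setpoint):
        self.digital_state.update(setpoint)
        self.physical_state.update(setpoint)    # e.g., written to the controller automatically


twin = DigitalTwin()
twin.on_sensor_reading({"temperature_C": 81.2})   # physical -> digital, automatic
twin.push_setpoint({"valve_position_pct": 35})    # digital -> physical, automatic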

A digital twin should not be thought of as a single technology but as a composition of different technologies that together link the physical and digital worlds. Digital twins can span the entire product lifecycle, from design through simulation, manufacturing, assembly, service, and support.

Because digital twins are designed to model complex assets or processes that interact with several components, environments, and unpredictable variables, manufacturers need to have a range of capabilities already in place to deploy them. These include computer-aided design modeling, connectivity, cloud computing, the Industrial Internet of Things (IIoT), a variety of software platforms, augmented reality (AR) and virtual reality (VR) hardware, artificial intelligence (AI) and machine learning (ML), and systems integration.

The challenges associated with digital twin development include data growth, cybersecurity, the extent of digital skills required, and change management. Despite the challenges, the digital twin is quickly becoming a relevant technology, and manufacturers should start thinking of piloting new projects.

Why digital twins are important now

During the past decade, digital twin capabilities have been evolving rapidly because of a series of technology enablers and drivers:

  • Access to larger volumes of data and machine learning makes it possible to create more detailed simulations with enhanced depth and usefulness of insights.
  • Better asset monitoring technologies and new sources of data enable continuous real-time simulations.
  • Enhanced industry standards for communications between sensors and operational hardware and diverse platforms have improved interoperability.
  • Better data visualization technology (e.g., 3D, VR, AR, AI-enabled visualizations) can handle greater volumes of data.
  • Instrumentation is becoming smaller, more accurate, more powerful, and cheaper.
  • Compute power, network, and storage are becoming more powerful and cheaper.

It is worth mentioning that several other key enabling technologies, such as AI, ML, IoT, high-performance computing, and cloud computing, have become cost-effective and widely adopted; they are what make digital twins so much more powerful today than they were five years ago. Technology companies are investing significantly in improving these digital enablers, and some of these investments are supporting the development of specific digital twin use cases.

Technology vendors have been shifting their attention to developing strong digital twin offerings. In the past few years, several—including IBM, Oracle, SAP, Microsoft, Amazon AWS, Rockwell, Siemens, and GE—have developed and launched digital twin offerings. Some also have made acquisitions to strengthen and build advanced digital twin capabilities.

Digital twin trends and smart factories

Digital twins are being used alongside other Industry 4.0 and smart factory applications in a variety of industries. A growing number of organizations in asset-heavy industries, such as aerospace, automotive, industrial products, and oil and gas, are implementing digital twins to transform their operations. Less asset-intensive and nonmanufacturing sectors, such as consumer goods, retail, facility management, health care, and smart cities, also are piloting and starting to adopt digital twins.

Companies are also increasing the scale of digital twin deployment because it provides real business value: it helps them transform and future-proof their businesses to deal with uncertainty and intense competition. According to Deloitte, digital twin technology can deliver specific business value in the following areas:

  • improve quality of products and processes and help predict and detect quality defects quicker
  • improve customer service by enabling a better understanding of equipment and determining warranty costs and claim issues more accurately
  • reduce operating costs by improving product design and equipment performance, streamlining operations, and reducing process variability
  • create record retention of serialized parts and raw materials to support tracking and quality investigation
  • reduce time to market and cost to produce a new product by reducing lead times of components and optimizing supply chain performance
  • create new revenue growth opportunities by helping to identify new products and improving efficiency and cost to service.

To help understand where digital twins can be used within a smart factory, a framework created by IoT Analytics breaks down the use cases and capabilities along three dominant dimensions: the hierarchical level of the digital twin (six levels), the lifecycle phase in which it is applied (six phases), and its use or capability (seven capabilities). An additional fourth dimension can be added that specifies the data type used by the digital twin; this can be real-time, historical, or test data (figure 2). According to this framework, therefore, there are more than 250 combinations or types of digital twins (6 × 6 × 7 = 252).

Figure 2. Digital twin classification framework. Source: IoT Analytics research
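
As a quick check on that arithmetic, the short snippet below enumerates the combinations implied by the framework. The dimension labels are abbreviated placeholders rather than the exact terms used by IoT Analytics.

from itertools import product

# Placeholder labels; the real framework uses IoT Analytics' own terms.
hierarchy = ["component", "product", "process", "system", "multi-system", "enterprise"]             # 6 levels
lifecycle = ["design", "build", "commission", "operate", "maintain", "decommission"]                # 6 phases
capability = ["describe", "monitor", "analyze", "diagnose", "predict", "prescribe", "orchestrate"]  # 7 uses

combinations = list(product(hierarchy, lifecycle, capability))
print(len(combinations))                                          # 252 = 6 x 6 x 7

# The Unilever case described below falls into one of these combinations.
print(("process", "operate", "orchestrate") in combinations)      # True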

One of the best examples of how to implement digital twins in manufacturing to support smart factory initiatives is Unilever. In 2019, Gartner named the consumer goods giant one of the industry’s best-performing supply chain leaders.

Unilever implemented digital twins of its manufacturing production line process to increase productivity, reduce waste, and make better operational decisions. Its digital twins are a type described by the IoT Analytics framework as process × operate × orchestrate. 

According to a 2019 article in the Wall Street Journal, devices send real-time information on physical variables, such as temperatures and motor speeds, into the cloud. Advanced analytics process this data and simulate conditions to ultimately map out the best operational conditions to control and adjust the production process. This results in better quality and productivity. Unilever worked with Microsoft to implement digital twins in dozens of its 300 global plants, and each twin reportedly was implemented in three or four weeks.

Another interesting example is digital twins in the field of maintenance prediction. Using the digital twin, a company can develop predictive maintenance strategies based on the digital replica of a machine or group of machines. 

With this technology, maintenance specialists can simulate future operations of the machine, create failure profiles, calculate the remaining useful life of the machine, and plan maintenance activities based on the simulation results. All of this happens without the machine being stopped. The digital twin collects machine data from the machine controller and external sensors; this data is fed into a simulation model that uses algorithms and data analysis technologies to predict the health status of the asset. According to the IoT Analytics framework, this type of digital twin is a product × maintain × predict digital twin.
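
A minimal sketch of the idea follows, assuming a simple linear degradation trend fitted to a monitored health indicator; real digital twins use far richer physics-based or machine learning models, and the data and threshold here are invented for illustration.

import numpy as np

def estimate_remaining_useful_life(hours, health_index, failure_threshold=0.2):
    """Fit a linear degradation trend to a health indicator (1.0 = new, 0.0 = failed)
    and extrapolate when it will cross the failure threshold."""
    slope, intercept = np.polyfit(hours, health_index, 1)
    if slope >= 0:
        return float("inf")                                   # no degradation trend detected
    hours_at_threshold = (failure_threshold - intercept) / slope
    return max(hours_at_threshold - hours[-1], 0.0)

# Hypothetical data collected from the machine controller and external sensors
operating_hours = np.array([0, 500, 1000, 1500, 2000])
health = np.array([1.00, 0.93, 0.88, 0.80, 0.74])

rul = estimate_remaining_useful_life(operating_hours, health)
print(f"Estimated remaining useful life: {rul:.0f} operating hours")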

Digital models, digital shadows, and digital twins are helping transform and optimize industrial and business operations today. Manufacturers that have not started down this path should begin thinking of pilot projects and what is needed to implement the technology.

Authors: Juan-Pablo Zeballos Raczy is chair of the ISA Digital Twin and Simulation committee within the SMIIoT Division. The committee provides expertise and guidance to implement digital twins as part of smart manufacturing programs. Zeballos Raczy is a senior consultant specializing in the digital transformation of manufacturing and supply chain operations. Previously, he worked for eight years in world-class consumer goods and mining organizations in Latin America. He holds a degree in electronic engineering and an MBA.

(Courtesy of ISA InTech)

Sustainable Automation

Introduction

Regarding what businesses should focus on, the Nobel Prize-winning economist Milton Friedman once opined:

“There is one and only one social responsibility of business–to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game, which is to say, engages in open and free competition without deception or fraud.”

This quote dates to an essay published back in 1970, and much like everything else, things have changed since then. Companies have increasingly come around to the notion that this sort of thinking can work well in the short term, but to truly succeed in the long term, companies must work to meet the needs of a diverse group of stakeholders* rather than a more exclusive group of shareholders. This change has seen a constant evolution in corporate initiatives over the years to meet this new focus. Today, the drive to increase stakeholder value focuses on three major areas: environmental, social, and governance (together referred to as ESG). As it has always done in applications where efficiency, prosperity, and transparency are involved, automation plays a central role in this new philosophy.

Automation in Environmental

Out of the three pillars of ESG, the environmental pillar is probably the one that is most closely related to automation for the simple fact that more efficient processes make more efficient use of materials, nicely fulfilling one part of the reduce, reuse, recycle triangle. Today’s automation, with its advanced instrumentation and controls, allows for closer control over processes, reducing the amount of material and energy needed to produce products. Further, as automation continues to improve, so too does quality, which allows for less material wasted on poor products and less energy wasted on rework. These two qualities are advantages to both the environment as a whole and the organization itself as less waste means fewer needless costs. Lower costs allow for more capital through higher margins or the ability to be more competitive with pricing. And yes, while larger profits can be funneled to shareholders, those organizations looking to adopt the ESG mantra are typically those looking at longer-term success rather than shorter-term windfalls.

Side two of the triangle, reuse, has been a core concept of the automation world since its inception, as most components are refurbished or repaired rather than discarded. In fact, this core trait of the industry has caused friction moving forward as organizations face the challenge of working to interface larger, older systems with state-of-the-art centralized control systems. This is particularly apparent in the paper and power industries where producing equipment such as paper machines and reactors can be decades old, in many cases older than many of the people working on them. Even on a smaller scale, companies are working to refurbish smaller equipment rather than outright replacing these systems, reducing costs for them and the environment.

Finally, with recycling, more precise measurements and better automation have helped companies better understand the cost savings and the tools to implement recycling programs. One excellent example of this is the work being done to reduce water usage in the semiconductor industry using recycling, thanks to new water treatment processes. As with other parts, the environmental aspect of ESG has a positive impact on the environment with less water being used and disposed of and on the bottom line of the company with lower utility costs. In some cases, this initiative literally saves the operation, with water scarcity issues driving companies to improve or abandon operations in certain areas.

Automation in Social

The social pillar of ESG focuses on the people, directly and indirectly, involved with organizations, and automation plays a key role in two important areas: The safety of those working in the organization, and the safety of those around it.

Safety standards have been on the rise over the years as companies realize the direct costs (e.g., insurance premiums) and the indirect costs (e.g., poor publicity) of ignoring employee safety. Automation plays a big part here, especially in higher-risk jobs that involve applications that are dirty, dull, or dangerous. In these three applications, automation may help to reduce the potential risk of injury to personnel. For example, advanced sensors may help to reduce risk by determining whether an application is potentially unsafe, or may be able to remotely isolate a hazardous process from personnel.

While companies have historically been good at focusing inwards, the social part of ESG forces companies to look out at the communities they are part of to ensure, at minimum, that they are not having a negative impact on the world around them. In some cases, this harm reduction can be closely related to environmental goals, such as the cleanup of the Cuyahoga River, which repeatedly caught fire because of industrial pollution.

However, in many cases, this can be the literal protection of lives in the areas around the production site, as was seen in the infamous Bhopal Gas Tragedy. In these cases, using automation to tightly monitor and control the use of products in production can ensure that those communities in proximity to the company are unharmed by the operation. This type of protection takes many forms, such as advanced continuous emission monitoring systems (CEMS), spectral gas monitoring for wide area applications, and ever-evolving water treatment systems to ensure that hazardous material doesn’t escape the site.

Automation in Governance

Much of the theme of governance regarding ESG centers around accountability and transparency, both of which are addressed through automation. Automation at levels 1 and 2 can ensure compliance with environmental and safety standards, and newer, more advanced monitoring solutions mean that standards can be better defined, and compliance can be better assured.

Analytical measurements are a great example of this. Lab tests that used to rely on manual titration and colorimetry are now being replaced or enhanced with spectral and electrochemical measurements, many of which are now being done directly in the process. Where once a process was validated by a liquid turning pink, it is now monitored at the part-per-billion level within a percentage point of error. This granularity is being used by companies and regulators to better monitor applications for compliance.

At levels 3 and 4, we can see the impact of Industry 4.0 in connecting the organization from top to bottom. This new age of information allows companies to directly monitor applications no matter how big or small and to ensure that key processes and policies are being followed. While key performance indicators (KPIs) have always been centrally developed, there have always been many layers between those who design them and those who implement them. With these layers comes the potential for data to be manipulated or at least delayed to the point of irrelevance. The promise of Industry 4.0 is the ability to ensure complete transparency within the organization. Data is no longer downloaded and transcribed to the point of use; it is now selected and streamed by those who need it. While this may not eliminate the risk of data manipulation, it does help to reduce it. In addition, more advanced analytics engines such as machine learning (ML) and artificial intelligence (AI) will help to identify potential areas of concern.

One area where governance and automation have significant synergy is the ability of automation to produce data that focuses on the skill of the employee rather than their identity. As companies have increased their strategic efforts on diversity, equity, and inclusion, the type of properly collected and curated data that may be provided by certain automation systems and processes can help keep the focus on employees’ performance and skills.

Automation and Sustainability are Key Pillars for New Business Growth

Sustainability and automation are tightly interlinked for new business growth. Over the past decade, many corporations pledged their commitment to sustainability and sustainable practices, with some making further commitments towards carbon neutrality and net zero. Shareholders and boards are demanding sustainability and social responsibility as key corporate values in addition to profitability from CEOs. Investors poured $120 billion into sustainability in 2021, double the 2020 level, and analysts expect the levels to reach trillions over the next two decades. Gartner estimates that automation could result in a $15 trillion benefit to the global economy by 2030, and further notes that automation can help fight inflation by reducing costs and driving new revenue streams and job creation.

Summary

Automation has a key role to play in creating and executing a successful ESG strategy. The ability of automation to provide transparency and accountability ensures that policies enacted are followed, and its ability to safely and effectively manage processes ensures that companies can have a positive impact on their surroundings. The advancements provided by Industry 4.0 will allow companies to further extend their ESG goals, helping them enact stricter environmental and efficiency policies, and advanced manufacturing concepts like the circular economy.

While much of the focus is on technology, the people at the core of automation cannot be forgotten. Engineers, technicians, managers, and executives will need new tools to take full advantage of what’s to come. Standards and technical resources from ISA can help those involved in automation properly manage the sheer amount of information they face, and new training programs can help provide the skillsets that will be needed to use automation to achieve sustainability goals within an organization. Finally, companies that focus their attention on using automation and sustainability will grow financially through ESG leadership. ISA could provide guidance for those in the industry that are pursuing sustainable automation.

*Stakeholders typically include internal and external sources such as employees, investors, suppliers, communities, owners, government, etc. This list will vary depending on the company, but essentially includes all those affected by the company and its operations.

About the Authors


Ryan Kershaw is a Senior Member of ISA and holds a Certified Automation Professional designation. Ryan works with Litmus Automation and is part of the Smart Manufacturing and IIoT division within ISA where he works with the Industry Maturity and Readiness Committee. Ryan lives just outside of Toronto, Canada with his wife, three kids, and his dog, and much like many Canadians, uses his love of hockey to get through the winters. Connect with him on ISA Connect.


Prabhu Soundarrajan is an executive board member at ISA and has 20 years of experience in Automation and ESG. He was recently elected as 2024 ISA president and has served on ISA’s executive board since 2017, holding several leadership roles in ISA as Vice President of Industry and Sciences. Prabhu is Vice-President of Innovation at Republic Services, a Fortune 300 company driving Sustainability in action. He loves Golden State Warriors basketball and golf. Connect with him on ISA Connect.

(Courtesy of ISA)

IDM Can Improve Plant Reliability and Reduce Downtime

Hundreds of smart instruments, including sensors and final control elements, are installed in medium- and large-size industrial plants, where they are used for control, optimization, or process monitoring. Smart instruments are microprocessor-based devices equipped with additional digital processing to provide extra functionality such as compensation, self-checking, diagnostics, and even control functions within the device. These functionalities, especially the diagnostics feature, can be leveraged to improve instrumentation maintenance and reduce unplanned downtime.

Intelligent device management (IDM) is a set of tools that can be installed on process control networks (PCNs) or on a plant's business network to collect smart instrument diagnostic information, which can then be used to improve operations and maintenance plans. However, identifying and mapping diagnostic information from various instrument models and vendors can be challenging and can require many engineering hours; NAMUR NE-107 can be the solution.

NAMUR is an international user association of automation technology and digitalization in the process industries. It has several working areas with numerous working groups in each area and has released numerous guidelines under each working area. NAMUR NE-107 is one of the guidelines and its focus is on “self-monitoring and diagnosis of field devices.”

Per NAMUR NE-107, instrument health diagnostics are collected under four main categories called the NAMUR status: Failure, Check Function, Out of Specification, and Maintenance Required.

Figure 1: NAMUR NE-107 Status

For example, all diagnostic info of a control valve now can be collected under the four NAMUR categories.

Intelligent Device Management collects smart instrument diagnostics and provides plant operators and reliability teams with instrument health dashboards based on actual info collected from field devices. The instrument health feature enables plant maintenance managers and technicians to see the overall plant sensor health picture and drill down to individual sensor alerts, error codes, and potential remedies based on NAMUR NE-107.
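
The sketch below shows one way such a roll-up might look in code: hypothetical vendor-specific diagnostic codes are mapped onto the four NE-107 categories so a dashboard can summarize device health. The codes, tags, and mapping are invented for illustration and are not from any particular vendor or IDM product.

from enum import Enum
from collections import Counter

class NE107Status(Enum):
    FAILURE = "Failure"
    CHECK_FUNCTION = "Check Function"
    OUT_OF_SPECIFICATION = "Out of Specification"
    MAINTENANCE_REQUIRED = "Maintenance Required"

# Hypothetical vendor-specific diagnostic codes mapped to NE-107 categories.
VENDOR_CODE_MAP = {
    "SENSOR_BREAK":       NE107Status.FAILURE,
    "LOCAL_OVERRIDE":     NE107Status.CHECK_FUNCTION,
    "RANGE_EXCEEDED":     NE107Status.OUT_OF_SPECIFICATION,
    "VALVE_STROKE_LIMIT": NE107Status.MAINTENANCE_REQUIRED,
}

def summarize_plant_health(device_diagnostics):
    """Roll up per-device diagnostic codes into NE-107 status counts for a dashboard."""
    summary = Counter()
    for tag, codes in device_diagnostics.items():
        for code in codes:
            status = VENDOR_CODE_MAP.get(code)
            if status is not None:
                summary[status.value] += 1
    return summary

# Invented device tags and active diagnostics
diagnostics = {
    "FV-101": ["VALVE_STROKE_LIMIT"],
    "TT-205": ["SENSOR_BREAK"],
    "PT-310": ["RANGE_EXCEEDED", "LOCAL_OVERRIDE"],
}
print(summarize_plant_health(diagnostics))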

As mentioned, the IDM can be installed on the process control network or on monitoring and optimization (M+O) layers as per NAMUR Open Architecture (NOA). The NOA offers a framework to implement innovative solutions both for new (Greenfield) as well as existing (Brownfield) plants. The main advantage of the NOA approach is that the process control core remains largely unaffected, which makes NOA especially appealing for Industry 4.0 innovations in brownfield plants.

Figure 2: NAMUR Open Architecture (NOA)

Smart instruments can be integrated into the IDM in several ways, for example by direct connection through WirelessHART or by connection over OPC Unified Architecture (OPC UA). On the communication side, there are various field device communication protocols, the most popular being HART, PROFIBUS, and FOUNDATION Fieldbus. A smart instrument can provide multiple process variables as well as plenty of diagnostic information regardless of the communication protocol. This information can be used to improve production, maintenance, and reliability.
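
As an illustration of the OPC UA route, the hedged sketch below uses the open-source python-opcua client to read a diagnostic value from an IDM or gateway server. The endpoint URL and node ID are placeholders; the actual values depend entirely on the server's address space.

from opcua import Client   # pip install opcua (the FreeOpcUa python-opcua package)

# Placeholder endpoint and node id; real values depend on the IDM/gateway configuration.
ENDPOINT = "opc.tcp://idm-gateway.example.local:4840"
DIAG_NODE_ID = "ns=2;s=FV-101.DiagnosticStatus"

client = Client(ENDPOINT)
client.connect()
try:
    diag_node = client.get_node(DIAG_NODE_ID)
    status = diag_node.get_value()               # e.g., an NE-107-style status code or string
    print(f"FV-101 diagnostic status: {status}")
finally:
    client.disconnect()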

IDM can help operations to reduce unplanned downtimes by providing instrumentation health notifications. With the development of field device technologies as well as NOA frameworks, integration of field devices into NOA M+O layers has become easier than ever.

If you have any questions or comments, please reach out via LinkedIn: Bakhtiar Pourahmad, P.Eng, PMP.

References

NAMUR NE-107
NAMUR Open Architecture (NOA)

Bakhtiar Pour Ahmad

Bakhtiar is a Senior Process Automation Engineer with the Freeport-McMoRan Technical Services Team. He is a Professional Engineer (P.Eng.) and certified Project Manager (PMP) with over 22 years of experience in various industries such as Oil & Gas, Power Generation, and Mining and Metals.

(Courtesy of ISA & Bakhtiar)

Robots Give Humans the Opportunity to Move Up, Not Out

It’s the subject of countless science fiction stories and panicked economic headlines. Are robots going to take all of the jobs?
 
While the narrative that automation is replacing humans makes a compelling dystopian film, the reality, as usual, is a lot more complicated. And for the labor force, the news is mostly good.
 
Robots in factories don’t look anything like the Terminator. Instead, they are formed to accomplish very specific tasks like assembly, packaging, grinding, polishing, and loading. The mundane and sometimes hazardous work targeted for automation is typically not ideal for humans to perform. It is true that some jobs are lost in this equation, but jobs are created, as well. To get the full picture, you have to look at both sides and calculate the net outcome.
 
With that in mind, here are four truths about automation and the manufacturing industry that showcase how robots and humans can work side by side––to the advantage of everyone.

1. The manufacturing industry is actually growing.

It is true that US manufacturing jobs have been steadily declining since the 1980s. However, the supply chain disruptions that came with the COVID-19 pandemic changed the way a lot of companies thought about manufacturing. It became starkly clear that a model completely reliant on overseas production is more vulnerable than a business with domestic manufacturing ties.
 
As a result, close to 350,000 manufacturing jobs were created in 2021 in the US, and tens of billions of corporate dollars have been earmarked for investment in factories that produce everything from semiconductors to solar panels.
 
So, while robotics have no doubt changed the face of manufacturing, the state of the industry is strong, and jobs are following.

2. There are currently hundreds of thousands of manufacturing job openings.

Automation has increased efficiency and decreased the number of line workers, but it has not eliminated the need for human workers altogether. Far from it––last year, there were more than 800,000 unfilled manufacturing jobs here in the US. That number is expected to grow in the coming years, with more than 2 million jobs projected to go unfilled.
 
As of 2020, there were 2.7 million industrial robots in operation, while there were 15.6 million human employees. There are many jobs that robots simply cannot fill, and factories will always be looking for talented and motivated workers to supervise machines and make the crucial decisions that keep everything running smoothly.

3. Robotics improve safety on the line.

Many of the jobs lost to robots on a manufacturing line are dangerous for people to perform.
 
Unsafe conditions might be obvious––using robots to handle unstable chemicals or radioactive materials is a no-brainer. But there are other dangers that not everyone thinks about. According to OSHA, heavy lifting is one of the leading causes of workplace injuries. More than a third of reported injuries were in the neck and shoulder. These types of injuries can lead to lifelong complications.
 
Robots, on the other hand, can lift up to 3,000 pounds without consequences. They can repeat the same motion without damaging irreplaceable muscles or joints, unlike humans. While not everyone views robotics as a positive for the future, 85% of Americans agree that robots taking over risky jobs is a good idea.

4. Reskilling is easier than ever.

Gone are the days when shifting your career focus meant returning to school or a lengthy certification process. The rise of remote work showed that, in the case of workplace learning, eLearning is an invaluable tool that provides an efficient and cost-effective way to teach workers new skills.
 
The workplace is changing everywhere, not just in the manufacturing sector. Effective training opportunities benefit workers by giving them invaluable skills that lead to higher-paying positions and long-term growth. The same training helps companies retain employees, reduce turnover costs and avoid unfilled positions.
 
Automation is changing the way factories work, but there is a huge need for skilled laborers in every area of manufacturing. As unsafe and tedious jobs are filled by robots, companies can expand, retrain and hire workers for higher-wage positions as mechanics, supervisors, and innovators. The people who know the industry best are the ones already inside it, and there is a massive opportunity for manufacturers to utilize the knowledge of line workers to make improvements that lead to future growth.

About The Author

Jordan Erskine has almost 20 years of experience in the beauty/skincare industry. Jordan currently serves as Co-Founder & Principal for the award-winning contract manufacturer Dynamic Blending. He has spent his entire career in the manufacturing and R&D world. Jordan started his career at a large contract manufacturing facility, Wasatch Product Development, in Draper, Utah. There he wore many hats in R&D and manufacturing. In 2015, Jordan decided to start Dynamic Blending with Gavin Collier due to the huge need for innovation in a stale industry, contract manufacturing.
 
Dynamic Blending Specialists is a full-service turnkey contract manufacturer of cosmetics, personal care, skin care, dietary supplements, nutraceuticals, and much more. Our team consists of industry experts with an extensive background in cosmetic chemistry (R&D), formulations, manufacturing, packaging, shipping, and quality.

(Courtesy of ISA)

Nord Drivesystems’ LogiDrive System Provides Optimized Solution Ideal for Intralogistics and Airports

The DuoDrive integrated gear unit and motor combine with the total LogiDrive package to form a high-efficiency solution capable of high-power density, quiet operation, and simple Plug-&-Play commissioning.

NORD’s LogiDrive complete drive solution reduces planning and commissioning efforts by offering an energy-efficient, standardized, and service-friendly system that is Industry 4.0 Ready! Permanent Magnet Synchronous Motor (PMSM) technology enables the LogiDrive system to maintain high efficiency even in partial load ranges and low speeds–making the solution especially suited for intralogistics, warehousing, and airport applications.

The LogiDrive package consists of:

  • High-efficiency two-stage bevel gearbox or DuoDrive
  • IE4 or IE5+ permanent magnet synchronous motor
  • Decentralized variable frequency drive
  • Power plug connector
  • M12 connectors
  • Incremental encoder
  • Pre-assembled cables
  • Standardized hollow shaft diameters

This solution reduces system variants through standardized geared motor selections tailored specifically to application needs and a large operable speed range via variable frequency drive technology. Simplifying engineering and selection into a compact, modular design significantly reduces spare parts inventory, enables fast commissioning through Plug-and-Play technology, and allows the replacement of individual components. The plug-in connections on the base product also enable easy maintenance, service, and installation.

When it comes to gearbox options for the LogiDrive package, two-stage helical bevel gear units or the new DuoDrive integrated gear unit and motor are available. Two-stage helical bevel gear units are made from high-strength aluminum alloy and feature an open housing option for better heat dissipation for high axial and radial loads. They excel in conveying and processing applications while providing a more efficient and reliable solution than typical worm units. The DuoDrive integrated gear unit and motor feature a compact UNICASE™ housing and deliver an extremely high gear efficiency of up to 92%. These drives also feature high power density, quiet operation, and fewer wear parts for low maintenance and long service life.

NORD’s IE4 and IE5+ synchronous motors provide some of the highest efficiencies currently available. The use of this technology in the LogiDrive system minimizes overall costs during service life, provides a faster return on investment, and maximizes system availability. When these motors are paired with the NORDAC LINK VFD, high precision regulation and increased system accuracy is achieved. This optimized combination also results in large overload capacities capable of constant torque over a wide speed range.

NORDAC LINK variable frequency drives offer quick installation and servicing thanks to their quick-disconnect cable options, integrated maintenance switch, and local manual control options. These decentralized VFDs feature functional safety options, an internal braking resistor for controlled, dynamic braking, and parametrization via plug-in control modules, NORDCON software, or NORDCON app. As part of the complete LogiDrive package, NORDAC LINK supports a large speed range–enabling automation for a variety of applications such as stacker cranes, automated transports, baggage handling systems, and conveyor systems.

The LogiDrive package provides a complete drive solution tailored to specific system needs. Not only does the modular design provide versatile arrangements, but it also reduces the number of variants, saves money in Total Cost of Ownership (TCO), and allows for each unit to be individually serviced–minimizing maintenance, downtime, and repair costs.

(Courtesy of ISA/NORD)

Introduction: The Birth of Industry 4.0 and Smart Manufacturing

Industry 4.0 is a paradigm shift in organizing and managing industrial businesses.

Industry 4.0 and smart manufacturing. What do these terms mean? Can they be used interchangeably or not?

It is nearly impossible to be in the manufacturing or the industrial automation industry and not have heard these buzzwords used in one form or another. They seem to be everywhere, actively discussed by thought leaders, industry experts, strategists, and company executives. They are written in mission statements and are even part of annual goals for a lot of companies, which gives the impression that everyone knows exactly what they are. But if you start asking people what the terms mean, they will either be honest and say, “I have an idea, but I don’t really know,” or they will give you an answer that is totally different from the next person’s.

And if that is the case, it would make using or achieving anything related to these concepts difficult, wouldn’t it?

The purpose of this special edition of InTech magazine is to help clarify these concepts by defining them, identifying the technology components, and explaining their relationship to one another and to your organization. Most importantly, we will answer the question: Why are these concepts such a big deal right now?

The birth of Industry 4.0

A timeline of manufacturing. Image 1: End of the 18th century, Industry 1.0: Mechanization; Image 2: Start of the 20th century, Industry 2.0: Electrification; Image 3: Start of the 1970s, Industry 3.0: Automatization; Image 4: Present, Industry 4.0: Cyber-Physical Systems. To understand the fourth industrial revolution, it helps to know the names and timeline of the first three.

Industry 4.0 (known as “Industrie 4.0” in Europe) was brought to life as a term and a concept in 2011 at Hannover MESSE, where Bosch described the widespread integration of information and communication technology in industrial production. The entire manufacturing industry, along with the German government, took interest in this idea.

After Industry 4.0 was introduced, the German government incorporated the idea into its "High-Tech Strategy 2020" action plan in 2012. The idea took hold, and soon dozens of other governments developed their own initiatives, all similar in purpose but different in execution and scope.

China developed "Made in China 2025" to fully modernize the country's manufacturing industry. The United Kingdom introduced its "Future of Manufacturing" in 2013; the European Union developed its "Factories of the Future" in 2014; Singapore came out with its "RIE2020" plan; and yes, the U.S., in 2014, launched the "Manufacturing USA" initiative that created a network of 16 member institutes. Each of the institutes focuses on a specific advanced manufacturing technology. They each pull together private-sector companies, academic institutions, and other stakeholders to pursue collaborative research and development, test applications, train workers, and reduce the risks associated with deploying new technologies.

A working group on Industry 4.0 was formed, led by Bosch executive Siegfried Dais and Henning Kagermann, the former chairman and CEO of SAP and president of the German National Academy of Science and Engineering. In 2013, this working group presented a set of Industry 4.0 implementation recommendations to the German federal government. From that moment forward, the fourth industrial revolution had begun, and the working group members were recognized as the founding fathers and driving force behind Industry 4.0.

An 85-page paper developed by the Industry 4.0 working group starts off by explaining how we are entering the fourth industrial revolution—hence the reference to “4” in “Industry 4.0.” To understand the fourth industrial revolution, it helps to remember the first three, and how we got to this point (figure). At the end of the 18th century, the first industrial revolution involved mechanization—using water and steam to increase production beyond that of manual labor. It can be represented by the introduction of the first mechanical loom in 1784. The second industrial revolution saw the development of assembly lines powered by electricity. Electrification typified Industry 2.0, which continued through the start of the 20th century.

Industry 3.0 introduced electronics and computers to replace manual processes. The dawning of this era of “automatization,” according to the Industry 4.0 working group paper, could be represented by the introduction of the first programmable logic controller, the Modicon 084.

Our present era, Industry 4.0, is known as the era of cyber-physical systems—the convergence of physical, digital, and virtual systems and the rise of the Internet of Things (IoT). Industrial IoT (IIoT) emphasizes manufacturing IoT as distinct from retail/consumer, medical, or other IoT devices or architectures. Industry 3.0 is about automation—the reduction of human intervention in processes. Industry 4.0 is about cognition or the process of acquiring knowledge and understanding. These two are separated by the ability to properly capture and harness the power of data.

Trying to define Industry 4.0

Industry 4.0 is not merely a matter of connecting machines and products through the Internet. It encompasses a wide range of advanced technologies, such as digital twins, artificial intelligence, high-speed wireless networks, deterministic wired networks, cloud and edge computing, and virtualization technologies like augmented reality. It is also a paradigm shift in how we organize, manage, and approach business to make the most of cyber-physical systems.

The working group characterized Industry 4.0 as a concept that is focused on creating smart products, smart procedures and processes, and smart factories. But that statement is so grandiose and vague that it is almost no help. With all that visionary talk, we can easily get excited and energized, but we still do not have a definition. The Industry 4.0 working group did not really provide one.

Over the past nine years, people have latched onto the concept of Industry 4.0. Each country attempted to define it in its context as it saw fit, which of course meant different ideas everywhere. Several years after the working group convened, two of the largest standards bodies, the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC), got together and formed a joint working group called JWG21. Its main intent was to define the concept of Industry 4.0. In the middle of 2021, the JWG21 finally established a definition. For myriad reasons, the term “smart manufacturing” was selected instead of “Industry 4.0.” The group felt it better represented a global viewpoint.

Here is the current formal definition of smart manufacturing:

Manufacturing that improves its performance aspects with integrated and intelligent use of processes and resources in cyber, physical, and human spheres to create and deliver products and services, which also collaborates with other domains within enterprises' value chains.

  • Note 1: Performance aspects include agility, efficiency, safety, security, sustainability, or any other performance indicators identified by the enterprise. 
  • Note 2: In addition to manufacturing, other enterprise domains can include engineering, logistics, marketing, procurement, sales, or any other domains identified by the enterprise.

As a society, we are starting to feel the impacts of Industry 4.0 already. Not only are companies investing, but governments around the world are pouring a lot of money into this idea as the way of the future. Smart manufacturing promises improved performance through the digital transformation of manual and mechanical systems, and the further integration of automated systems with business systems and advanced technologies. We all are in the midst of this paradigm shift and are being compelled to move our companies forward. The birth of Industry 4.0 is giving way to growth and change, asking us to help move our companies toward whatever the next revolution might bring.

Jeff Winter is an industry executive for manufacturing at Microsoft. Winter is also part of the leadership committee of the Smart Manufacturing & IIoT Division of ISA, a contributor to IEC as a member of TC 65, on the board of directors with the Manufacturing Enterprise Solutions Association (MESA), and benchmarking chair with Control System Integrators Association.

Industrial IIoT: Transforming Operations with Data Communications

The smartness of a factory lies in its ability to make optimal and timely decisions.

The Internet of Things (IoT) is a relatively recent concept that originated out of the advancement and penetration of Internet services. The term IoT was first used by Kevin Ashton in 1999 to promote radio frequency identification (RFID) technology. It refers to smart devices with some computational capability that are connected to the Internet and share data in formats that can be used for further analysis. Day-to-day consumer examples of IoT devices include Alexa smart speakers, connected cars, and smart wearables.

When the application of IoT is extended to industrial use cases, it is called IIoT. It is also sometimes referred to as the “industrial Internet” or “Industry 4.0.” IIoT originated out of the commercial concept of the Internet of Things and the advancement and penetration of Internet services within industrial environments. The term “industrial Internet” reportedly was coined by GE for the convergence of critical assets, advanced predictive and prescriptive analytics, and modern industrial workers.

IIoT is a network of smart sensors, actuators, and systems using communication technologies that support real-time analysis and communication of the data produced by devices in factories or in the field. The ability to gather real-time data enables monitoring, exchange, and analysis of the data for meaningful insights. These insights enable smarter and faster business decision-making for manufacturing organizations.

In general, an IIoT ecosystem consists of the following (Figure 1):

  • connected smart devices that gather and communicate over a network
  • a public or private communications infrastructure
  • processes that analyze the data gathered by smart devices and produce business use information
  • data storage that houses data in a central location
  • people who consume the information to make informed decisions.
Figure 1: Industrial Internet of Things

IIoT enables the true convergence of information technology (IT) and operational technology (OT). The smart edge devices in the field or factories communicate the captured data intelligently over the communication infrastructure. The data is consumed to drive actionable information and trend analysis for machinery. The analyzed information supports informed decision-making for predictive maintenance, safety, security, and business optimization. 

An IIoT system with IT-OT convergence can be seen as a layered modular architecture of digital technology (Figure 2). It can be divided into four technology layers: 

  • Content layer: People-interface devices like computer screens, tablets, smart glasses, and smart surfaces
  • Service layer: Applications and software to analyze data and transform it into actionable, insightful information
  • Network layer: Communication infrastructures such as Wi-Fi, Bluetooth, LoRa, 4G/5G cellular, and other methods that send and receive the data
  • Device layer: Smart edge devices, cyber-physical systems, sensors, actuators, and machines.
Figure 2: A layered modular architecture of digital technology.
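
As a rough illustration of how a measurement might traverse these layers, the sketch below passes a device-layer reading through simplified network, service, and content stages. All function names and the threshold are invented; the real layers are, of course, full communication stacks and applications.

import json

# Hypothetical, simplified walk of one measurement up the four layers.

def device_layer():
    """A smart edge sensor produces a raw measurement."""
    return {"tag": "TT-101", "value": 72.4, "unit": "degC"}

def network_layer(reading):
    """The communication infrastructure serializes and transports the reading."""
    return json.dumps(reading)                       # e.g., an MQTT or OPC UA payload

def service_layer(payload):
    """Applications analyze the data and turn it into actionable information."""
    reading = json.loads(payload)
    reading["status"] = "high" if reading["value"] > 70 else "normal"
    return reading

def content_layer(info):
    """People-facing devices present the insight to an operator."""
    print(f"{info['tag']}: {info['value']} {info['unit']} ({info['status']})")

content_layer(service_layer(network_layer(device_layer())))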

Trends and Standards

The growing adoption and multitude of players in the IIoT space have necessitated the development of standards. In 2020, the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) jointly released three IoT standards: 

  • ISO/IEC 21823-2 specifies a framework for transport interoperability to enable information exchange within and between IoT systems. 
  • ISO/IEC TR 30164 describes the concepts, characteristics, and technologies of edge computing for IoT systems applications. 
  • ISO/IEC TR 30166 applies to IIoT systems and landscapes.

In addition, the Industrial IoT Consortium has developed several volumes of architecture and specifications for IIoT. Sixteen consortia and associations and 17 standards development organizations are helping to define and standardize the IIoT environment.

An abundance of communication standards and frameworks also exists. These include MQTT, a bidirectional, TCP/IP-based publish-subscribe communication protocol; REST, a scalable architectural style built on the Hypertext Transfer Protocol (HTTP) and used for edge-to-cloud communication; Node-RED, an open-source platform developed by IBM to connect APIs, hardware, and online services; OPC, a series of standards developed by the OPC Foundation for industrial communication to connect controllers with computers and the cloud; Chatty Things, an open framework being developed by the XMPP Standards Foundation for scalable IIoT infrastructure; Cognitive IoT, a framework being developed by IBM that combines IoT with machine intelligence, contextual information, and learning using natural language processing; and MindSphere, a cloud-based platform developed by Siemens to integrate IoT edge devices, applications, and services in one place.
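
To illustrate the publish-subscribe pattern that MQTT uses, here is a minimal sketch based on the Eclipse paho-mqtt client (1.x callback API). The broker address and topic are placeholders.

import json
import paho.mqtt.client as mqtt   # pip install "paho-mqtt<2" for the 1.x callback API used here

BROKER = "broker.example.local"   # placeholder broker address
TOPIC = "plant1/line3/sensor/temperature"

def on_connect(client, userdata, flags, rc):
    client.subscribe(TOPIC)        # analytics side: subscribe to edge data
    # For the demo, publish a sample reading (normally sent by the edge device itself)
    client.publish(TOPIC, json.dumps({"value": 72.4, "unit": "degC"}))

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)
    print(f"{msg.topic}: {reading}")

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883)
client.loop_forever()              # blocks; processes network traffic and callbacks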

These standards and frameworks are shaping the IIoT landscape, as are Industry 4.0 reference architectures being developed around the world. Industrial Internet Reference Architecture (IIRA), the German Industrie 4.0, and the RAMI model are all independent efforts to create a defined standard for IIoT-enabled facilities.

Why IIoT is important now

IIoT has affected the industrial sector significantly and brought many benefits to digital manufacturing. The advancement in IoT technologies and the availability of the Internet have helped other advanced technologies, such as cloud computing, big data analytics, and artificial intelligence/machine learning, penetrate the industry. This has comprehensively contributed to a robust infrastructure for cyber-physical systems. The traditional industrial systems like supervisory control and data acquisition and distributed control systems have improved in monitoring, performance, productivity, and, more importantly, in efficiency with the advent of IIoT, contributing to the profitability of organizations.

The convergence of the physical devices over network infrastructures with smart edge devices, real-time analysis of data from the production process, visibility into process parameters, control of processes, and data exchange have all improved significantly. Overall, IIoT has enabled data-driven decision-making and positively affected the accuracy and predictability of these decisions in industrial environments.

Trends and benefits enabled by IIoT

Digitalization enabled by IIoT has been increasing rapidly during the past decade. According to Statista, the global market for IIoT was worth more than 263 billion USD in 2020 and is expected to grow to some 1.11 trillion USD by 2028.

During the COVID pandemic, the adoption of IIoT-enabled technologies increased significantly; remote work requirements were a major push factor. The focus of IIoT implementation in recent years has been workforce management initiatives, automation, and customer experience improvements.

The advanced applications brought about by the proliferation of IIoT technology are the most significant trend associated with IIoT. The following list was compiled by ATS:

  • Remote monitoring and operation: The advantage of sensor-based data analytics is to access data and devices on demand.
  • Edge sensor advances: The edge capability and penetration of 4G/5G communications have enabled faster communication and robust sensors.
  • Predictive analysis: Data-driven trend analysis has improved on-time maintenance, reducing downtime and increasing production.
  • Digital twins: Smart sensor data feeds into digital twin models and makes remote monitoring and management more reliable and efficient.
  • Health and safety: IIoT-driven technologies contributed to health and safety, especially during the pandemic time in 2020–21. Employee locations on the facility floor, tracking of close contacts, and temperature recording all contributed to safe and healthy work environments.
  • Agile and flexible infrastructures: IIoT advances provide unprecedented flexibility in areas such as supply chain, so manufacturers can be agile in supplier selection, ordering and procurement strategy, and inventory management.
  • Smart factory: Increased penetration and use of 5G wireless communications within factories are taking digital manufacturing in new directions. Smart factories are a reality in 2022.
  • Data analysis at the source: The abundance of data generated by the smart edge devices makes it important to analyze the data at the source in a timely manner. Factories are changing their technical architecture, bringing data analysis and artificial intelligence (AI) technology out to the “edge” to take full advantage of the IIoT ecosystem.
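
As a rough illustration of analysis at the source, the sketch below keeps a rolling window of readings on the edge device and raises an alert locally instead of shipping every sample to the cloud. The window size and deviation limit are arbitrary choices for the example.

from collections import deque
from statistics import mean

class EdgeAnomalyDetector:
    """Keep a rolling window of readings on the edge device and flag values
    that deviate sharply from the recent average."""
    def __init__(self, window_size=50, deviation_limit=0.15):
        self.window = deque(maxlen=window_size)
        self.deviation_limit = deviation_limit      # arbitrary 15% band for illustration

    def process(self, reading):
        alert = False
        if len(self.window) == self.window.maxlen:
            baseline = mean(self.window)
            if baseline and abs(reading - baseline) / abs(baseline) > self.deviation_limit:
                alert = True                        # only this event needs to leave the edge
        self.window.append(reading)
        return alert

detector = EdgeAnomalyDetector(window_size=5)
for value in [100, 101, 99, 100, 102, 130]:         # last value jumps ~30%
    if detector.process(value):
        print(f"Edge alert: reading {value} deviates from the recent baseline")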

The integration of IT and OT, increased Internet and communication speeds, and fast data analysis have supported the conversion of digital factories into smart factories. IIoT platforms integrate IT functions with OT functions and transform factory floor operations. Legacy machines and sensors are being integrated with IT systems, and edge intelligence has been introduced. Lately, 5G penetration has accelerated this transformation by eliminating cabling and enabling ultra-reliable, mission-critical wireless communications.

The smartness of a factory lies in its ability to make optimal and timely decisions. Humans may or may not be part of such decisions. IIoT makes highly advanced technologies possible. For example, companies are adopting robotics and unmanned autonomous vehicles at deeper levels to augment or replace human workloads. Machine learning and artificial intelligence are being used to analyze data gathered by sensors and monitoring devices to make real-time decisions and improve the efficiency of production.

The huge operational advantages of smart factories are beginning to be realized as the pace of AI-driven process intelligence, blockchain-enabled supply chain management, and crypto-enabled edge security picks up. In the next few years, we will see the true transformation of factories as technologies like digital twins, the industrial metaverse, token-based economies, and algorithmic trust are enabled by lightweight edge computing. IIoT enables all that and more.

This feature originally appeared in InTech magazine’s August issue, a special edition from ISA’s Smart Manufacturing and IIoT Division.

About The Author

Shiv Kataria leads the IIoT committee within ISA’s SMIIoT Division. One of the focuses of this committee is to document best practices and frameworks and share the knowledge throughout the automation community. The use cases shared by the committee are from professionals working in IIoT areas who have been implementing IIoT for the digitalization of factories. Kataria is a research professional with Siemens Cybersecurity and Trust Research India. He is an electronics and communication graduate and holds certifications including CISSP, ISA/IEC 62443 Cybersecurity Expert, CEH, and ISMS 27001.

(Courtesy of ISA/IIoT and Author: Shiv Kataria)

Acromag’s Ethernet Remote I/O Modules Add Conditional Logic Computing

Acromag's new BusWorks NT series remote I/O modules are now enhanced with conditional logic. The conditional logic increases functionality with a system of rules that allows complicated decisions to be built from relatively simple "yes/no" questions. For example, an analog or digital input value can trigger an action when one or more conditions occur, such as controlling a relay when a discrete input is ON and a temperature threshold is crossed. More complex math computations and logic are also an option.

Conditional logic is configured on the NT’s built-in web configuration page. No programming is required. The modules support up to 64 conditions using IF/THEN/ELSE statements.
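
The rule style described above can be pictured as IF/THEN/ELSE evaluations over live I/O values. The Python sketch below mimics that behavior generically; it is not Acromag firmware or configuration syntax, and the channel names and limits are invented.

# Generic emulation of IF/THEN/ELSE conditional logic over I/O values.
# Not Acromag's configuration syntax; channel names and limits are invented.

io_image = {
    "AI1_temperature_C": 78.5,    # analog input
    "DI1_pump_running": True,     # discrete input
    "DO1_cooling_relay": False,   # relay output
}

rules = [
    # IF the pump is running AND temperature exceeds 75 degC
    # THEN energize the cooling relay ELSE de-energize it
    {
        "if": lambda io: io["DI1_pump_running"] and io["AI1_temperature_C"] > 75.0,
        "then": {"DO1_cooling_relay": True},
        "else": {"DO1_cooling_relay": False},
    },
]

def evaluate(rules, io):
    """Apply each rule's THEN or ELSE branch to the I/O image."""
    for rule in rules:
        branch = "then" if rule["if"](io) else "else"
        io.update(rule[branch])
    return io

print(evaluate(rules, io_image)["DO1_cooling_relay"])   # True: both conditions are met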

NTE Ethernet I/O models have dual RJ45 ports and a webserver with Modbus TCP/IP and Ethernet/IP communication to monitor or control the internal I/O channels. An integrated DIN rail bus allows connections of up to three NTX expansion I/O modules. Each I/O module adds up to 16 input or output signals allowing a mix of voltage, current, temperature, TTL, and relay control signals networked on one IP address. The space-saving design requires only 25mm of DIN rail per module. Ethernet I/O modules distribute 9-32V DC power along the DIN rail bus to expansion modules. Hazardous location approvals, high noise immunity, and -40 to 70°C operation make this I/O ideal for use in harsh environments.

“With this new conditional logic capability, users can easily implement rule-based control functions without programming,” stated Robert Greenfield, Acromag’s business development manager.

The NT2000 Series offers a wide variety of I/O signal processing options. Nine I/O configurations are available as either NTE Ethernet I/O or NTX expansion I/O models. Analog I/O models feature eight differential or sixteen single-ended inputs for monitoring current or voltage signals. Discrete I/O models provide 16 tandem input/output channels with either active high/low input or sinking/sourcing output. A six-channel mechanical relay output model is also available. For temperature monitoring, a thermocouple input model supports many sensor types as well as millivolt ranges. More models will be released over the coming months for additional I/O functions.

Profinet communication is planned for release soon. Each module will support all three protocols, which are selectable using any web browser to configure the network settings and I/O operation. The modules typically function as a network slave but also offer Acromag’s i2o peer-to-peer communication technology to transfer data between modules directly, without a host or master in between. Multicast capability is included.

Acromag, a mid-sized international corporation, has been developing and manufacturing measurement and control products for over 60 years. The company offers a complete line of industrial I/O products, including process instruments, signal conditioning equipment, data acquisition boards, distributed I/O systems, and communication devices.

(Courtesy of ISA/Acromag)

Softing’s SmartLink Product Family Offers Extended Functionalities for Plant Asset Management

The SmartLink product family from Softing enables end customers to make efficient use of connectivity at the interface between OT and IT. The products allow simple and scalable integration of device data into plant asset management applications. The new versions of SmartLink HW-DP v1.20 and SmartLink SW-HT v1.20, which are now available, offer enhanced data transfer and connectivity functionalities.

SmartLink HW-DP – Integration of Industry 4.0 applications in PROFIBUS and HART systems

SmartLink HW-DP enables access to the process, asset, and diagnostic data from PROFIBUS devices and HART devices connected to PROFIBUS remote I/Os, as well as secure export to any system inside and outside the user’s own network. Version v1.20 now supports publishing asset and diagnostic data from field devices via MQTT. This allows easy integration into typical IoT system architectures, such as the NAMUR Open Architecture (NOA) or the IoT reference architectures of the large cloud platforms.
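
As a rough illustration of how that MQTT data can be consumed on the IT side, the sketch below subscribes to diagnostic messages with the widely used paho-mqtt library (shown with its 1.x callback API). The broker address, topic filter, and JSON payload format are assumptions for illustration, not Softing specifications.

    # Sketch of an MQTT consumer for asset/diagnostic data; broker address and
    # topic filter are hypothetical. Uses the paho-mqtt 1.x callback API
    # (paho-mqtt 2.x requires a CallbackAPIVersion argument to Client()).
    import json
    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        payload = json.loads(msg.payload.decode("utf-8"))  # assumes JSON payloads
        print(f"{msg.topic}: {payload}")

    client = mqtt.Client()
    client.on_message = on_message
    client.connect("broker.example.local", 1883)        # hypothetical broker
    client.subscribe("smartlink/hw-dp/+/diagnostics")    # hypothetical topic structure
    client.loop_forever()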

SmartLink SW-HT – Data access via Emerson AMS Device Manager and other HART IP-enabled plant asset management applications

SmartLink SW-HT allows access to configuration and diagnostic data via Emerson’s AMS Device Manager or other HART-IP-enabled plant asset management applications. As the only solution available on the market, SmartLink SW-HT has so far supported Schneider Electric M580 controllers and drop I/Os as well as Allen-Bradley controllers and remote I/Os. Version v1.20 now also connects Emerson AMS Device Manager to HART devices connected to R.Stahl IS1+ remote I/O. More and more modern remote I/Os use Ethernet to connect to the controller, and SmartLink SW-HT accommodates this trend by providing an Ethernet connection for tunneling HART commands to remote I/Os. As a Docker container, SmartLink SW-HT can be managed via Kubernetes-based management platforms or the services of the major cloud platforms.

Both new product versions expand the possibilities for end customers to implement open, standards-based, and scalable system architectures with the SmartLink product family and to integrate connectivity into IT-managed edge solutions.

(Courtesy of ISA/Softing Inc.)

Five Keys to Powering Remote Wireless Devices

Battery-powered remote wireless devices are taking industrial automation to increasingly remote locations and extreme environments. The growing list of applications includes supervisory control and data acquisition (SCADA), process control, asset tracking and management, safety systems, field equipment status, flow monitoring, machine-to-machine (M2M) communication, artificial intelligence (AI), and wireless mesh networks.

Identifying the ideal power source for a remote wireless device requires a fundamental understanding of each application’s unique power requirements, followed by selection of the battery whose performance capabilities best match them.

This decision-making process typically centers around five key considerations:

  • Evaluating the device’s specific energy demands
  • Choosing the battery chemistry that best suits the needs
  • Understanding the importance of battery self-discharge
  • Adapting to high pulse requirements
  • Doing your homework

Evaluating device energy requirements

If a wireless device is easily accessible and operates within a reasonably mild temperature range, it may allow for the use of an inexpensive consumer-grade alkaline or lithium battery. However, the performance requirements for a battery are far different for long-term deployments in hard-to-access and hostile environments.

These devices must conserve energy by operating mainly in a standby state, drawing microamps of average current with periodic high pulses in the multi-amp range to power wireless communications. These low-power devices are predominantly powered by industrial-grade lithium thionyl chloride (LiSOCl2) batteries (Figure 1) that feature very high capacity, high energy density, an extended temperature range, and an exceptionally low annual self-discharge rate.

A relatively small number of remote wireless devices draw milliamps of average current with pulses in the multi-amp range, draining enough average energy to prematurely exhaust a primary (non-rechargeable) battery. These niche applications are often better suited to an energy harvesting device combined with an industrial-grade lithium-ion (Li-ion) battery to generate high pulses.
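
A quick duty-cycle calculation makes the low-power profile described above concrete. The figures in the sketch below (standby current, pulse current, pulse length, and transmission schedule) are hypothetical placeholders for values that would come from a real device’s datasheet.

    # Rough average-current estimate for a device that sleeps at microamp levels
    # and wakes for brief multi-amp radio pulses. All figures are hypothetical.
    STANDBY_CURRENT_A = 10e-6   # 10 uA base current
    PULSE_CURRENT_A = 2.0       # 2 A during transmission
    PULSE_DURATION_S = 0.05     # 50 ms per transmission
    PULSES_PER_DAY = 24         # one transmission per hour

    seconds_per_day = 24 * 3600
    pulse_seconds = PULSES_PER_DAY * PULSE_DURATION_S
    average_current_a = (STANDBY_CURRENT_A * (seconds_per_day - pulse_seconds)
                         + PULSE_CURRENT_A * pulse_seconds) / seconds_per_day

    # With these numbers the pulses add roughly 28 uA on top of the 10 uA standby
    # draw, so the average load is still only tens of microamps.
    print(f"Average current: {average_current_a * 1e6:.1f} uA")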

Choosing the best battery type for the application

Numerous primary lithium battery chemistries are available (Table 1), each offering unique advantages and disadvantages. At one end of the spectrum are inexpensive alkaline batteries that deliver high continuous energy but suffer from a very high self-discharge rate (which limits battery life) as well as low capacity and energy density (which adds size and bulk). In addition to being short-lived, consumer-grade alkaline cells cannot operate in extreme temperatures due to their water-based constituents. For this reason, many remote wireless devices are powered by industrial-grade lithium batteries.

Figure 1: Bobbin-type LiSOCl2 batteries are preferred for remote wireless applications, delivering high energy density, up to 40-year service life, and the widest possible temperature range, making them ideal for use in inaccessible locations and extreme environments.
Table 1: Numerous primary lithium battery chemistries are available.

As the lightest non-gaseous metal, lithium features an intrinsic negative potential that exceeds that of all other metals, delivering the highest specific energy (energy per unit weight), the highest energy density (energy per unit volume), and a higher open-circuit voltage (OCV) ranging from 2.7 to 3.6 V. Lithium battery chemistries are also non-aqueous and therefore less likely to freeze in very cold temperatures.

Among all commercially available primary lithium chemistries, bobbin-type lithium thionyl chloride (LiSOCl2) stands apart as overwhelmingly preferred for ultra-long-term deployments. Bobbin-type LiSOCl2 chemistry delivers the highest capacity and energy density of them all, endures extreme temperatures (-80°C to 125°C), and features an annual self-discharge rate as low as 0.7 percent per year, enabling up to 40-year battery life. Bobbin-type LiSOCl2 batteries are specifically designed for use with low-power communications protocols such as WirelessHART, ZigBee, and LoRa, to name a few.

The main performance benefits of bobbin-type LiSOCl2 batteries include:

  • Higher reliability: Ideal for remote locations where battery replacement is difficult or impossible and highly reliable connectivity is required.
  • Long operating life: Since the energy lost to self-discharge can exceed the energy actually consumed by the device, high initial capacity and a low self-discharge rate are often critical.
  • The widest temperature range: Bobbin-type LiSOCl2 cells can be modified to work reliably in extreme temperatures (-80°C to 125°C).
  • Smaller size: Higher energy density could permit the use of smaller batteries.
  • Higher voltage: Could allow for the use of fewer cells.
  • Lower lifetime costs: A critical consideration since the manpower and logistical expenses to replace a battery far exceed its cost.

Importance of battery self-discharge

A remote wireless device is only as reliable as its battery, so design engineers must specify the ideal power source based on a number of factors, including the amount of energy consumed in active mode (including the size, duration, and frequency of pulses); energy consumed in standby mode (the base current); storage time (as normal self-discharge during storage diminishes capacity); thermal environments (including storage and in-field operation); equipment cut-off voltage (as the battery capacity is exhausted, or in extreme temperatures, the voltage can drop to a point too low for the sensor to operate). Often, the most critical consideration can be the battery’s annual self-discharge rate, as the amount of current consumed by self-discharge can exceed the amount of energy required to operate the device.
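
To see why self-discharge can dominate the energy budget, the sketch below converts an annual self-discharge rate into an equivalent continuous current drain for an assumed 19 Ah bobbin-type cell (the capacity figure is illustrative) and compares the 0.7 percent and 3 percent annual rates cited elsewhere in this article.

    # Express annual self-discharge as an equivalent continuous current drain.
    # The 19 Ah capacity is an assumed, illustrative figure for a D-size cell.
    CAPACITY_AH = 19.0
    HOURS_PER_YEAR = 24 * 365

    for rate_per_year in (0.007, 0.03):   # 0.7% vs. 3% annual self-discharge
        loss_ah_per_year = CAPACITY_AH * rate_per_year
        equivalent_current_ua = loss_ah_per_year / HOURS_PER_YEAR * 1e6
        print(f"{rate_per_year:.1%} self-discharge ~ {equivalent_current_ua:.0f} uA continuous drain")

    # Roughly 15 uA for a 0.7 percent cell versus 65 uA for a 3 percent cell:
    # a lower-quality cell can lose more current to self-discharge than a
    # low-power device draws in operation.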

All batteries experience some amount of self-discharge as chemical reactions draw current even while the cell is unused or disconnected. Self-discharge can be minimized by controlling the passivation effect, whereby a thin film of lithium chloride (LiCl) forms on the surface of the lithium anode, separating it from the electrolyte and reducing the chemical reactions that cause self-discharge. Whenever a current load is placed on the cell, the passivation layer causes initial high resistance and a temporary drop in voltage until the discharge reaction begins to dissipate the passivation layer, a process that repeats each time a load is applied.

Passivation can be affected by the cell’s current discharge capacity, the length of storage, storage temperature, discharge temperature, and prior discharge conditions, as partially discharging a cell and then removing the load increases the level of passivation over time. Controlling passivation is ideal for minimizing self-discharge but too much of it can overly restrict energy flow.

Competing bobbin-type LiSOCl2 cells vary considerably in their self-discharge rates. For example, the highest-quality LiSOCl2 batteries can feature a self-discharge rate as low as 0.7 percent per year, retaining nearly 70 percent of their original capacity after 40 years.

Conversely, lower-quality LiSOCl2 cells can have a self-discharge rate as high as 3 percent per year, exhausting nearly 30 percent of their available capacity every 10 years and limiting maximum battery life to 10-15 years.
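
Those retention figures follow from simple arithmetic. Treating the quoted rate as a fixed fraction of nominal capacity lost per year (a simplification that ignores temperature, load profile, and passivation history), remaining capacity can be estimated as shown below.

    # Simplified capacity-retention estimate: annual self-discharge treated as a
    # fixed fraction of nominal capacity lost each year.
    def remaining_fraction(rate_per_year, years):
        return max(0.0, 1.0 - rate_per_year * years)

    print(f"0.7%/yr after 40 years: {remaining_fraction(0.007, 40):.0%} remaining")  # ~72%
    print(f"3%/yr after 10 years:   {remaining_fraction(0.03, 10):.0%} remaining")   # ~70%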

Adapt for high pulse requirements

To support two-way wireless communications and other advanced functionality, remote wireless devices must generate periodic high pulses of up to 15 A. Standard bobbin-type LiSOCl2 cells normally cannot deliver high pulses due to their low-rate design. However, they can be easily modified with the addition of a patented hybrid layer capacitor (HLC) (Figure 2). This hybrid solution uses the standard bobbin-type LiSOCl2 cell to deliver low-level background current during standby mode, while the HLC delivers the high pulses required to support data queries and transmission. As an added benefit, the HLC features a unique end-of-life voltage plateau that can be used to deliver low-battery status alerts.
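
The role of the capacitor in such a hybrid arrangement can be sketched with the basic droop relation ΔV = I × Δt / C. This is a generic sizing calculation with hypothetical pulse figures, not the design of the patented HLC.

    # Generic capacitor sizing for a current pulse: during the pulse the capacitor
    # supplies the load, and its voltage sags by dV = I * dt / C. Figures are hypothetical.
    PULSE_CURRENT_A = 2.0     # radio transmission pulse
    PULSE_DURATION_S = 0.05   # 50 ms
    ALLOWED_DROOP_V = 0.4     # how far the supply may sag during the pulse

    required_capacitance_f = PULSE_CURRENT_A * PULSE_DURATION_S / ALLOWED_DROOP_V
    print(f"Required capacitance: {required_capacitance_f:.2f} F")  # 0.25 F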

Supercapacitors perform a similar function in consumer products but are generally ill-suited for industrial applications due to serious limitations, including short-duration power, linear discharge qualities that do not allow for the use of all available energy, low capacity, low energy density, and very high self-discharge rates of up to 60 percent per year. Supercapacitors linked in series also require expensive cell-balancing circuits that add bulk and drain additional current, further shortening operating life.

Figure 2: Bobbin-type LiSOCl2 batteries can be combined with a patented hybrid layer capacitor (HLC) to deliver up to 40-year service life along with the high pulses required for two-way wireless communications.

Do your homework

When designing for long-term deployment in a highly remote location or extreme environment, it pays to spend a little more for a superior-grade battery that can last for the entire lifetime of the device, eliminating the need for costly battery change-outs. Achieving this cost-saving goal requires careful due diligence, as lithium batteries are not all created equal.

For example, the annual self-discharge rate of a bobbin-type LiSOCl2 battery can vary significantly based on how it is manufactured and the quality of its raw materials. Unfortunately, a lower-quality cell with a high self-discharge rate can be hard to identify, as capacity losses are not easily measurable for years, and theoretical battery life expectancy models tend to underestimate the passivation effect as well as the impact of long-term exposure to extreme temperatures.

To properly compare competing battery brands, users must demand fully documented and verifiable test results along with in-field performance data under similar loads and environmental conditions. Learning about the subtle differences between seemingly identical cells can pay huge dividends by reducing your long-term cost of ownership.

This feature originally appeared in the ebook Automation 2022: IIoT and Industry 4.0 (Volume 3).

(Courtesy of ISA/Tadiran Batteries and Author: Sol Jacobs)