The PLM Fashion Report 2023: Why Integrations to PLM are driving digital transformation in Fashion

BY CHRIS JONES | JUNE 15, 2023 | 11 MINUTE READ     

It's here! 🎉 The Interline has published The Fashion PLM Report 2023.

And the exciting news for JBSO Group is that Chris Jones has contributed an editorial to this year's report, discussing what it means to integrate data from an ever-expanding fashion technology ecosystem of tools and workflows with PLM, and what it will take to enable a fully connected digital value chain that can overcome the fashion industry's challenges.

Download the full report from The Interline, and read on for a recap of the editorial.

WHY INTEGRATIONS TO PLM ARE DRIVING DIGITAL TRANSFORMATION IN FASHION

AS FASHION TECHNOLOGY ECOSYSTEMS BECOME BROADER AND MORE COMPLEX, THE IMPORTANCE OF INTEGRATIONS IS INCREASING. BUT WHAT DOES IT MEAN TO INTEGRATE TOOLS AND WORKFLOWS THAT SUPPORT DIFFERENT AREAS OF THE PRODUCT LIFECYCLE WITH ONE CENTRALISED SOURCE OF TRUTH? WHAT WILL IT TAKE TO REALISE A FULLY CONNECTED VALUE CHAIN?

I'm honoured to have been asked to write a piece for this prestigious report, discussing integrations with PLM. Or rather, to explain why we need integrations, how many may be required, and how to decide which integrations are essential to the business. Integrations within PLM solutions are not new: there have been many integrations with point solutions over the last 35 years of PDM and PLM, most driven by individual use cases, often specified by the requirements of the respective companies. However, only relatively recently have PLM vendors worked with other solution vendors to create integration frameworks for specific applications.

WHY DO WE NEED TO INTEGRATE?

The fashion industry's supply chain is a complex web of interconnected solutions, processes, and stakeholders, all contributing to the final product's cost, quality, sustainability, and speed to market. Currently, process and product data are generated in a range of diverse applications spread across supply chain partners, tiers, and geographies. Unfortunately, this data is often siloed, manually shared, or based on templates, making it difficult and time-consuming to acquire accurate product-specific supply data. To optimise the complex factors within the supply chain for every product, businesses must embrace a data-driven approach that allows all roles in the supply chain to make informed decisions based on near real-time and accurate primary data for all process options across the supply chain.

THE RETAIL GALAXY

What does ‘integration’ look like? First, we must understand how data is created and when it’s required for decisions. We can use an astronomical analogy and describe a ‘Retail Galaxy’ containing all the processes and data that support our businesses: Planning, Merchandising, Design, Development, Sourcing, Costing, Manufacturing, Shipping, Warehousing, Distribution, and Replenishment to Sales Channels. Of course, various applications and databases support all those processes and data, often with many vendor choices for each type of application.

We can separate this Retail Galaxy into ‘solar systems’. For simplicity, we can divide it into transactional and non-transactional processes and data, with PLM at the centre of non-transactional and ERP at the centre of transactional.

The PLM and ERP ‘solar systems’ are themselves integrated to varying degrees. For example, PLM provides critical non-transactional data, such as product and material data, to ERP as a foundation for transactional processes and data.

Within the PLM ‘solar system’, ‘planets’ orbit the central PLM data model. Each planet represents a process area comprising multiple unique processes, each with its own associated data. These process-area ‘planets’ could be described as Management, Merchandise Planning, Creative Design, Marketing, Consumer, Materials Development, Colour Development, Technical Development, Sourcing & RFQ, and Environmental & Social Governance.

The PLM process and data ‘solar system’. Source: WhichPLM 

Within each ‘planet’, there are multiple processes and associated data. For example, for Creative Design, these processes could include Trend Analysis, Storyboard, Concept Development Manual, 2D vector design, 3D avatar & engineering design, 3D printing, and CAD for knits, weaves, prints, plaids, and stripes.

The data from all these processes is collected within the central PLM data model and shared with roles across all processes, using the mechanisms of collaboration, workflow, and automation.

THE REALITY OF THE PLM SOLAR SYSTEM

As defined above, there are ten ‘planets’ and more than sixty types of processes. A business will already perform these processes and collect the associated data where relevant to its product types and business model. The question is whether each process is supported by an application or performed manually, and how the data is pushed to the central PLM data model. For most businesses, the data from many of these sixty-plus processes is transferred to the central PLM database manually, if at all.

If we wish to optimise the complex factors within the supply chain for every product via a data-driven approach that allows informed decisions across the supply chain, then we need to enable real-time data from every process to be available at the centre of our business's PLM ‘solar system’.

THE ANATOMY OF AN INTEGRATION

The critical questions for any integration are which application collects, manages and ‘owns’ the data (the Parent), where the data is to be shared (the Child), how the two will be connected, and how often the data will be transferred.

This data will drive on-demand dashboards to highlight business performance and enable rapid data-based decisions through interrogation by AI/ML models. Therefore, for accurate and (near) real-time data sharing, Application Programming Interfaces (APIs) should be used to map the data and to set the frequency and trigger(s) for the integration. APIs can be open and available to all licensees, or closed when developed for a specific purpose/client and subject to an exclusivity agreement. We’d recommend open APIs, trading short-term exclusivity for the long-term evolution and support of the API.
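To make the Parent/Child relationship and the API parameters concrete, here is a minimal sketch in Python of how such an integration could be described. Every name in it (IntegrationDefinition, labour_costing_app, the field mappings) is hypothetical and purely illustrative, not taken from any PLM or costing product.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class IntegrationDefinition:
    parent: str                       # application that collects, manages and 'owns' the data
    child: str                        # application the data is shared with
    field_map: Dict[str, str]         # mapping of parent fields to child fields
    trigger: str                      # e.g. "on_update" (event/webhook) or "scheduled"
    frequency_minutes: Optional[int]  # polling interval when the trigger is "scheduled"

# Hypothetical example: share approved labour costing summaries into PLM.
labour_costing_to_plm = IntegrationDefinition(
    parent="labour_costing_app",
    child="plm",
    field_map={
        "style_ref": "product_id",
        "total_smv": "bill_of_labour.total_smv",
        "labour_cost": "bill_of_labour.cost",
    },
    trigger="on_update",        # the Parent pushes changes as they happen
    frequency_minutes=None,     # not used for event-driven integrations
)
```

The same structure covers both event-driven integrations, where the Parent pushes changes as they happen, and scheduled integrations, where data is pulled at a defined frequency.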

INTEGRATION PURPOSE AND APPLICATION CAPABILITY

Business processes are supported by applications. However, not all applications are equal, so we need to understand the requirements of our business processes in tandem with the capabilities of the applications, to define what data to share and how to share it. To illustrate this point, we can consider three examples of process data integrations and the issues to consider.

DEFINING THE ORIGIN OF THE DATA

Many PLM vendors state that their PLM applications have a Bill of Labour, but is that correct? For example, there may be a table where operations can be manually populated, or even a library where those operations can be stored for reuse. However, do they have pre-determined global standard libraries of time-motion operations, supported by method codes generated by Predetermined Motion Time Study (PMTS), and calculation of a Method Standard defined by a methodology recognised by international bodies such as the International Labour Organization?

There are several labour costing applications for fashion products that support work study engineering methodologies, operations libraries, and the training and certification of individuals and factories, and which have received recognition from the ILO and other international bodies. Yet it’s still not that simple, as the products supported also differ by application. For example, of three globally recognised labour costing applications and associated methodologies, GSDCost and timeSSD support apparel labour operations, whilst SATRA TimeLine supports footwear labour operations. In a business that produces both apparel and footwear, two separate applications would be required to support the labour costing process, and two different integrations to a standard data structure.

Therefore, the question to ask the PLM vendor may be: with which labour costing applications does the PLM application integrate to display, as a minimum, a) summarised labour costing for product variations, and b) the Bill of Labour (all operations, SAMs/SMVs, and cost per factory) for each product’s supply variation? The answer will tell you whether the PLM application can display an accurate labour costing and whether the integration can be automated rather than manually updated. An informed decision can then be made based on the data and efficiency requirements of the business versus the capabilities, cost, budget, and timeline of the elements of the connected solution.
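As an illustration of the difference between a simple operations table and a full Bill of Labour, here is a minimal sketch of the structure such an integration might carry into PLM. The field names are assumptions made for the example, not drawn from GSDCost, timeSSD, or SATRA TimeLine.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LabourOperation:
    description: str      # e.g. "Attach collar"
    method_code: str      # PMTS-derived method code
    smv: float            # Standard Minute Value for the operation

@dataclass
class BillOfLabour:
    product_id: str
    factory_id: str                   # labour cost differs per factory / supply variation
    minute_cost: float                # cost per standard minute at this factory
    operations: List[LabourOperation]

    def total_smv(self) -> float:
        return sum(op.smv for op in self.operations)

    def labour_cost(self) -> float:
        # The summarised figure PLM would display: total SMV x factory minute cost.
        return self.total_smv() * self.minute_cost
```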

AVAILABILITY OF THE LATEST DATA

When colour standards were initially offered in electronic form, any PLM vendor could enter an agreement and receive a file to import to their application, including updates with 100+ new colours for cotton and paper substrates every year or two. However, the distribution of colour standards has evolved. Now, new colour standards are available immediately in ‘live’ updates. For example, the Pantone colour standard is supported by Pantone Live. Does the PLM or Creative Design application have the capability to ‘connect and forget’ to provide those new colours immediately for your creative design and colour teams?
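A ‘connect and forget’ integration might look something like the sketch below, which polls a colour-library service for standards published since the last sync and pushes them into the design environment. Both clients and all of their methods are hypothetical; this is not the Pantone Live API or any vendor's actual interface.

```python
import datetime

def sync_new_colours(colour_client, plm_client, last_sync: datetime.datetime) -> datetime.datetime:
    """Pull colour standards published since the last sync and push them to PLM (hypothetical clients)."""
    new_colours = colour_client.colours_published_since(last_sync)
    for colour in new_colours:
        plm_client.add_colour_standard(
            code=colour["code"],
            name=colour["name"],
            lab_values=colour["lab"],       # colour data for accurate reproduction
            substrate=colour["substrate"],  # e.g. cotton or paper
        )
    # Record when this sync ran so the next call only fetches newer colours.
    return datetime.datetime.now(datetime.timezone.utc)
```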

DO APPLICATIONS AND DATA MODELS SUPPORT THE REQUIRED DATA

My final consideration relates to new use cases and data types. In this example, the driver is the focus on improving sustainability, and the regulations upon us that require measurement of supply chain process variations using science-based primary data. The challenge is to capture accurate data for a defined process, and this must be achieved for all processes across all tiers and all supply chain partners. This has led to the creation of new businesses focussed on this challenge. Take the example of measuring Greenhouse Gases (GHG), supported by companies such as Made2Flow: millions of data points have been captured, with applications enabling the definition and comparison of product supply variations in terms of CO2e.

This is something that a PLM vendor could only replicate with enormous investment, and, given the imminent sustainability legislation, integration is a straightforward and beneficial use case across the fashion industry. However, the capture of GHG data is fundamentally based on the breakdown of processes for every operation that generates CO2 emissions. Currently, the capability to capture processes in a Bill of Process (BoP) does not exist in any PLM application. This must be addressed so that the details of supply chain processes and the breakdown of CO2 emissions can be shared seamlessly with PLM. Without a BoP, each supply variation for a product must be either an attachment or summarised.
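To show why a Bill of Process matters, here is a minimal, hypothetical sketch of how step-level CO2e data could be structured and rolled up to a summary per supply variation; none of the names are drawn from any PLM or GHG application.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ProcessStep:
    description: str     # e.g. "Yarn spinning", "Dyeing", "Sea freight"
    tier: int            # supply chain tier where the process is performed
    partner_id: str      # supply chain partner performing the process
    co2e_kg: float       # emissions for this step, as calculated by the GHG application

@dataclass
class SupplyVariation:
    product_id: str
    variation_id: str
    steps: List[ProcessStep]

    def total_co2e_kg(self) -> float:
        # The summarised figure PLM could display per supply variation,
        # while the step-level breakdown remains available for comparison.
        return sum(step.co2e_kg for step in self.steps)
```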

A summary of CO2e per supply variation may sound like an acceptable outcome until you consider the workflow. In this scenario, we rely on a supply chain specialist populating the correct process combinations in the GHG calculation application, which are then calculated and pushed back to PLM for assessment by other teams alongside cost, margin, timeline, and demand estimates.

This is a slow and clumsy workflow. A streamlined workflow requires all supply options to be available to the design team at the earliest opportunity, whether picked from templates or suggested by AI/ML, then calculated seamlessly by the integrated GHG application and returned immediately, enabling informed decisions that improve sustainability, cost, and timeline for both product and workflow.

PLM applications must evolve to include new functions and extended data models for deep and practical integration to enable this efficiency.

WHAT ARE THE ACTIONS REQUIRED?

Having scratched the surface with basic examples for three of the 60+ unique process types and data sets in the PLM ‘solar system’, we can see many integration options that cannot all be addressed simultaneously. What prioritised actions could be considered?

UNDERSTAND THE WORKFLOW TO UNDERSTAND THE TECHNOLOGY

A vital element of any successful implementation is that technology supports processes. If you have a good workflow process, you will have good efficiency and adoption, and vice versa. An efficient workflow process provides the user with accurate data with minimum effort at the earliest opportunity to make an informed decision.

Brands, retailers, manufacturers, and suppliers already know the data required to make critical decisions within their workflow. They must understand their current and ‘best practice’ workflow, where relevant data is available to support informed decisions at the earliest opportunity. This will form a blueprint for the prospect/customer to understand and prioritise the requirements for implementing their best practice business workflow, including a prioritised timeline for the data and technology to support it. Instead of a long wish list of functions and a box-checking exercise, requirements will be clearly defined for software vendors, enabling: a) an open discussion between prospect/customer and vendor of technology capabilities and the roadmap to support best practice workflow, and b) the vendor to prioritise their development and integration roadmap.

BUILD STRATEGIC PARTNERSHIPS

The argument for integrated data across processes, supply chain partners, and disparate applications to drive efficiencies, cost-savings and speed to market is familiar to the fashion industry. There are many examples of application-to-application integrations, but these have been made on a case-by-case basis to drive software sales, with use cases that are ‘low-hanging fruit’ or customer-specific problem statements.

Strong collaboration between brands, retailers, manufacturers, suppliers, and software vendors is essential to understand where primary data is generated and what integration is required to share it with decision-makers across the supply chain.

  • Brands, retailers, manufacturers, and suppliers must partner to map the supply chain processes, with specialist assistance to expedite progress.
  • Individual businesses should bring a deeper understanding of their current and ‘best practice’ workflow for technology evaluations. Ideally, industry best practices would be defined.
  • Software vendors must partner to enable the collection, integration, and visibility of primary data across the supply chain. A greater understanding of workflow from prospects & customers will allow prioritisation of the roadmap for integrations and new application functions and features to support them.

The industry must create genuine strategic partnerships that share data and insights, accepting that an effective connected solution must be a seamlessly integrated collection of specialised applications.

CONCLUSION

Statements on investing in technology and integrating data to provide visibility across the supply chain are 30 years old, so what needs to change in the fashion industry? We shouldn’t expect altruism from every business in the fashion industry, yet we need to change some self-focussed behaviours. There are many areas to address to facilitate a complete digital value chain; a single company could not achieve this alone. Genuine partnerships must be created where each partner delivers their specialised solution element to the highest standard, and seamless data integration proves that the whole is greater than the sum of its parts.

Reproduced with kind permission from The Interline.

CHRIS JONES, FOUNDER & DIRECTOR, JBSO GROUP

After originally training and working as an engineer, Chris joined a fashion services and technology company 30 years ago to implement ISO9001. Since then, he has helped over a hundred fashion brands, retailers, sourcing agents, and manufacturers to optimise their processes, supported by innovative technologies and concepts, working in offices, showrooms, and factories worldwide.