CenturyPly

Reducing TAT & Improving Efficiency

With a vast network of channel partners spread across the country, catered to from the respective RDCs across locations for its Deco vertical, CenturyPly had a dire need to reduce TAT. The supply chain team came up with the idea of a Central Distribution Center (CDC) to address all the possible issues related to TAT. Here’s how they achieved it…

CENTURYPLY has been a frontrunner in applying innovation at work. This simple philosophy has been the cornerstone of all its processes and technologies. It has led the company to design and deliver contemporary lifestyle statements that have become synonymous with modern living. Today, the company is recognized as the largest seller of multi-use plywood and decorative veneers in the Indian organized plywood market.

CenturyPly’s laminate distribution network ran directly from the manufacturing plant to pan-India distribution centers, without any central hub or plant warehouse. This used to result in high inventory at the warehouses and lower flexibility for managing demand fluctuations, as the replenishment lead time was high. With locations spread geographically across the country and fed from a manufacturing plant at a single location, the TAT was observed to be on the higher side, as it included PLT + QLT + TLT:

• PLT – Plant Lead Time: the time taken to produce the material

• QLT – Queuing Lead Time: the time spent accumulating an FTL (Full Truck Load) quantity

• TLT – Transit Lead Time: the in-transit time taken to reach the designated RDC location
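
To make this composition concrete, here is a minimal illustrative sketch using purely hypothetical lead-time values (not CenturyPly’s actual figures). It shows how the TAT adds up and, anticipating the CDC solution described later, how it shortens once dispatch can happen from ready finished-goods stock rather than waiting for production:

```python
# Minimal illustration with purely hypothetical lead times (in days); not CenturyPly's figures.
PLT = 5   # Plant Lead Time: producing the material
QLT = 3   # Queuing Lead Time: accumulating a Full Truck Load
TLT = 4   # Transit Lead Time: road transit to the designated RDC

tat_make_to_order = PLT + QLT + TLT   # RDC waits for production, queuing and transit
tat_from_ready_stock = QLT + TLT      # with finished goods already in stock, PLT drops out

print(tat_make_to_order, tat_from_ready_stock)   # 12 vs 7 days in this example
```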

The problem had a further spiralling effect owing to the number of SKUs, approximately 1,000+, with production requirements generated on the basis of SKU consumption at the different RDC locations.

Challenges

With a vast network of channel partners spread across the country, catered to from the respective RDCs across locations for the Deco vertical, reducing TAT had become a dire need. Increasing competition across markets triggered the requirement for material to be available in full at the different RDC locations against placed orders, adding to the above-mentioned cause. At the same time, the space and size of the RDCs remained unchanged, as the same locations had been in use to cater to both peak and low requirements in the market. Furthermore, increasing space is a costly affair subject to management approval. Service level was another dimension that could not be ignored at the same point in time.

Faster replenishment of the different RDCs from a single manufacturing plant location could only be achieved if the TAT, or its components, were lowered. The components of TAT had already reached their highest operational efficiency, and further scope of improvement for them seemed very bleak.

APPROACH & METHODOLOGY

Consolidating the demand for different SKUs so that the plant produces in bulk/large lots, and distributing from ready stock at the plant as a single point, seemed a step closer to a solution for the challenges at hand. The manufacturing plant was producing as per the consumption of SKUs at the different RDCs and had no provision to keep finished goods stock on the premises. The team decided to build a Central Distribution Center at the manufacturing plant to address the above issues. First, an FG (finished goods) stocking location was identified with due approvals from the management; then the required system configuration was done in the ERP, and finished goods were stored in vertical racking.

Outcome

The SCM team came up with the idea of a Central Distribution Center (CDC) to address all the possible issues related to TAT. This reduced the TAT from requirement generation to dispatch from the plant, as dispatch now happens from ready stock of finished goods instead of production followed by dispatch. An over 55% reduction in kilometres run was achieved for the DCs that used to be fed from an RDC and are now catered to directly from the plant. The company achieved a 55% reduction in carbon emissions and a 45.6% reduction in cost due to direct feeding of the DCs from the plant. The solution significantly improved sales, as the TAT has been reduced and stock availability has gone up by 8%.

DTC (DIRECT TO CUSTOMER) PROJECT

Cost to serve is an increasing challenge for every supply chain function to address. With a view to reducing it, CenturyPly identified an innovative way to save cost and increase service levels. The team identified the need for, and possibility of, direct delivery from the manufacturing plants to the dealer or project site. Those orders were being served in smaller loads from regional warehouses in multiple deliveries, resulting in dual handling and higher freight costs, as the dealers were not willing to invest in bulk purchases.

The SCM team worked out a threshold value for sharing the saving with dealers in case they lift a higher load directly from the plant. The success of this project depends largely on the channel partners, as they have to place large-lot orders. Benefits, in the form of an additional discount, have been passed on to the channel partners to make the project successful.
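
As an illustration only, the sketch below uses hypothetical cost figures (not CenturyPly’s actual workings) to show the kind of break-even comparison that sits behind such a threshold: direct full-truck dispatch pays off once an order is large enough that its per-unit cost, including the discount shared with the dealer, falls below the per-unit cost of serving the same volume via a regional warehouse.

```python
# Hypothetical break-even sketch for a direct-to-dealer dispatch threshold.
# All figures are illustrative assumptions, not CenturyPly's data.

def cost_per_unit_via_rdc(secondary_freight=40.0, handling=15.0):
    # Per-unit cost when serving smaller loads from a regional warehouse (dual handling)
    return secondary_freight + handling

def cost_per_unit_direct(units, truck_cost=30000.0, dealer_discount=10.0):
    # Per-unit cost of a full-truck direct dispatch, including the additional
    # discount passed on to the dealer for lifting the larger load
    return truck_cost / units + dealer_discount

order_units = 800
saving_per_unit = cost_per_unit_via_rdc() - cost_per_unit_direct(order_units)
# A positive saving means this order size crosses the direct-dispatch threshold
print(round(saving_per_unit, 2))
```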

By encouraging direct delivery to channel partners from the manufacturing plants, the team aimed to achieve the following:

• Higher business volume with large-lot orders

• Economies of scale in freight by using higher-capacity vehicles for dispatch

• Savings on other operational costs, e.g. handling and secondary freight, compared to delivery through secondary locations

This project delivered excellent results, and at present 30% of the volume goes directly to dealers, resulting in a 4.3% annual supply chain cost saving, while the reduced handling of direct dispatch ensures a lower TAT and a lower chance of B&D.

Elanco Material Master Data Miner

With the aim of enhancing supply chain data quality, the Elanco team conducted a series of brainstorming sessions focused on two main aspects: data quality rule creation and the data quality improvement process. A data warehouse, integrated with a web application and an interactive dashboard, was established leveraging SAP and Azure resources through the use of Delta Live Tables. Here’s their success story…

ELANCO is an Animal Health organization boasting an extensive portfolio of 6,800+ SKU-market combinations. With a widespread presence spanning manufacturing units and sales operations, both internal and external, Elanco operates on a global scale. Ensuring the integrity of critical data stored in SAP tables, which hold millions of entries, across key dimensions including accuracy, completeness, and integrity is of utmost importance. Managing the vast supply chain master data entails overseeing hundreds of fields, each governed by specific rules and interdependencies. However, due to limitations in data quality monitoring options, it is challenging to identify and program all the necessary rules. Consequently, the time and effort required to conduct regular reviews of data quality is significantly high.

Elanco follows the IPP (Innovation, Product & Portfolio) strategy, a comprehensive approach to addressing the evolving needs of customers in the Animal Health landscape and delivering innovative solutions that leverage products, processes, and cutting-edge technologies to enhance animal well-being. One of the key transformation success stories of the Elanco supply chain team is Material Master Data Miner, a robust data quality management system. It is a unique, first-of-its-kind ML-based application that not only measures current data quality but also helps identify discrepancies based on the rules mined among the data fields.

Challenges

1. Unable to Measure Data Quality and Identify Problematic Entries: The existing approach of relying on BO reports or manual analysis on local machines falls short of providing a comprehensive solution for assessing data quality. It not only lacks a holistic view but also demands significant time investment.

2. Limited Agility in Adapting to New Rules: The current process of creating and implementing rules in the existing system is time-consuming. Given the importance of regularly evaluating data, a more agile approach is crucial to swiftly respond to new rules and requirements.

3. Absence of a Process to Rectify Data Entries Resulting from Knowledge Gaps: A notable gap exists in terms of a structured process for correcting data entries that are created due to a lack of knowledge regarding rule creation. This gap hinders the accuracy and reliability of the data.

4. Investigating New Rules and Relationships within a Vast Data Set: Manual detection of rules and relationships among data fields poses an enormous challenge for data stewards, given the exceptionally large volume of data. The complexity and scale make it nearly impossible to accomplish this task manually.

5. Lack of a Standardized Approach to Data Quality Improvement: Establishing a standardized approach encompassing dedicated functionalities, from data creation to rule creation and maintenance, is crucial for fostering continuous improvement in data quality. Such an approach would streamline the entire process and facilitate consistent enhancements in data quality throughout the organization.

Approach

With the aim of enhancing supply chain data quality, a series of brainstorming sessions were conducted involving industry experts and senior leadership, leading to the creation of user stories. The initiative focuses on two main aspects: data quality rule creation and the data quality improvement process.

A data warehouse was established, leveraging SAP and Azure resources through the use of Delta Live Tables. In Azure, components such as Databricks, storage containers, and Data Factory were implemented for rule creation by identifying relationships among data points. These resources were seamlessly integrated with a web-based frontend application equipped with features for creating, modifying, editing, activating, and deactivating rules. ML metrics are employed to prioritize rules based on model validation metrics such as lift, support, and confidence, while also aiding in identifying outliers and understanding the relationships between data fields.
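
As a rough sketch of the underlying technique rather than Elanco’s actual code, the example below shows how association-rule mining can propose candidate data-quality rules from material master fields and score them by support, confidence and lift. The field names, values and the use of the open-source mlxtend library are illustrative assumptions:

```python
# Illustrative association-rule mining over (hypothetical) material master fields.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# Toy extract of material master records (hypothetical fields and values)
records = pd.DataFrame({
    "material_type": ["FERT", "FERT", "ROH", "FERT", "ROH"],
    "procurement_type": ["E", "E", "F", "E", "F"],
    "storage_condition": ["COLD", "COLD", "AMB", "COLD", "AMB"],
})

# One-hot encode field=value pairs so each record becomes a "basket" of items
baskets = pd.get_dummies(records).astype(bool)

# Mine frequent item sets, then derive candidate rules ranked by lift and confidence
frequent = apriori(baskets, min_support=0.3, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.8)
rules = rules.sort_values("lift", ascending=False)

# Candidate rules such as "material_type=FERT -> storage_condition=COLD";
# records violating a high-confidence rule are flagged as potential outliers
print(rules[["antecedents", "consequents", "support", "confidence", "lift"]])
```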

Users are empowered to define their own rules specific to tables and measure field quality, facilitating the correction of data entries. To gain insight into tables with poor data quality and into problematic data entries, a live Microsoft Power BI dashboard was implemented. Key performance indicators (KPIs) on the dashboard provide valuable information on dimensions such as accuracy, completeness, and integrity. The dashboard’s outputs are used to establish a data quality improvement process, enabling the creation of an action plan. Data owners and stewards collaborate closely to rectify problematic entries, thereby improving overall data quality through a continuous improvement approach. Additionally, ad-hoc and periodic rule mining jobs can be scheduled using the frontend application. This comprehensive approach empowers Elanco to proactively address data quality concerns, ensure efficient decision-making, and drive continuous enhancements across its supply chain.
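
A simplified sketch, using assumed table and field names rather than Elanco’s, of how a user-defined rule and the completeness/accuracy style KPIs surfaced on such a dashboard might be computed:

```python
# Illustrative user-defined rule check and data-quality KPIs (hypothetical data).
import pandas as pd

plant_data = pd.DataFrame({   # assumed plant-level material master extract
    "material": ["M1", "M2", "M3", "M4"],
    "mrp_type": ["PD", None, "PD", "VB"],
    "safety_stock": [100, 50, None, 0],
})

# User-defined rule: materials planned with MRP type "PD" must carry a safety stock
violations = plant_data[(plant_data["mrp_type"] == "PD") & (plant_data["safety_stock"].isna())]

# Completeness KPI: share of non-null entries per field
completeness = plant_data.notna().mean().round(2)

# Accuracy KPI for this rule: share of in-scope records that satisfy it
in_scope = plant_data[plant_data["mrp_type"] == "PD"]
accuracy = 1 - len(violations) / max(len(in_scope), 1)

print(violations, completeness, round(accuracy, 2), sep="\n")
```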

Outcome

As mentioned above, the Material Master Data Miner application has been developed with key features which help in:

Creating new rules – Each network team would have a network data steward (NDS) and data owners who are trained to run the rule mining engine to investigate new rules. The same people would have the ability to create user-defined rules. Governance over rule activation and change is needed, as what might seem like a great rule for one network might completely break another. The owner role approves new rules and rule changes, ensuring they are truly written correctly and are suitable to apply globally. This helps create and implement new rules among data points and identify outliers. The outliers can be analysed using Power BI, detailing why a record failed a particular rule, so that an action plan can be developed. Rules can be activated, deactivated or edited as per the data owners’ / stewards’ requirements.

Driving Data Quality – As a continuous improvement initiative, each network team would have an NDS who takes ownership of the failed records for their network/affiliates, works with the ‘owner’ to confirm these really are problem records (and not a rule problem), and then works with their team to get them fixed. NDS/owners run the scheduled mining jobs on a periodic basis on combinations of data fields or tables, or merge multiple tables into one, to investigate new rules and enable real-time outlier identification and data correction. Material master data quality improved to 96% for the in-scope tables post implementation of the application.
