[Smarter Process] Material Preparation Transparency

Jui Han Lin
6 min read · Apr 24, 2021

In the Smarter Process Series, I reflect on projects I’ve led in my previous jobs, in the hope of reasoning out some principles or trains of thought for brainstorming, designing, and implementing improved processes for smarter supply chains.

In this project, I applied data analysis to pinpoint previously unmeasurable inefficiencies in a material preparation process that spanned multiple functions in the factory plant. As I did so, I realized that finding a cohesive database to start with is a challenge in itself. Here is how I tackled the problem:

Photo by Philippe Roy

Background

Every morning from 8:30 until the final SKU hit the line, our factory office was like a war room with project managers bustling about, urgently phoning our counterparts to track down the status of materials. Part of supervising the iPhone’s rigorous product development included ensuring that all 200+ SKUs in the BOM were ready for assembly by the start of every dayshift. Even short delays could result in hundreds of assembly workers idling, and exorbitant costs to the factory.

However, the reality was brutal. On a typical day, we needed to track down around 20 SKUs that were late to the line. The number doubled or even tripled as we approached the end of a build or project, when leftover inventory was scarce. Sometimes that meant hours of stalled production.

Chasing down a part was fundamentally complicated. The SKU could be in any of the six main processes, each branching out into many subprocesses managed by different functions:

An SKU received from the supplier is first sorted in a dedicated warehouse, then sent to one of 13 IQC rooms for a quality check (which includes passing through an indefinite number of workstations, depending on customer requests). Later, the acceptable parts are stored in the warehouse and assigned for use. Finally, designated SKUs are passed to the kitting rooms for allocation across 3–5 production lines.

Given the large number of parties and processes, variability in special requests from the customer, and tight time constraints, the efficiency of the long supply chain was always considered unmanageable, and the root causes of delays could never be clearly identified nor resolved. As a result, the team was stuck in a vicious cycle of resolving rather than preventing delays on a case-by-case basis, as our daily, frantic telephone calls indicated.

The poorly-managed process also invited functions to misstate their UPH (units per hour), claiming a throughput rate below full capacity. With no evidence to prove them wrong, the PM team found itself in a constant tug of war with the stakeholders whenever schedules fell behind.

I was inspired to devise a better solution to the black box operations.

Problem

I realized that if we could develop transparency in the material preparation process, we could then truly control operations. What we needed was a management framework to easily calculate efficiencies and spot bottlenecks to hold the lagging functions accountable.

Observation

Based on my experience applying analytics to abstract problems, I knew data analysis was the only solution; however, at that time we lacked an integrated database to track efficiency. In large-scale corporations such as ours, management power is often dispersed. Without unified management, individual functions can operate under performance goals misaligned with the values or overall good of the company, and such organizations are unlikely to have a database built to track cross-functional efficiency.

During my frequent visits to the supply chain, I serendipitously uncovered an SAP database maintained specifically for security purposes. Our client cared immensely about confidentiality and thus asked that every key part in the form factor be scanned and traced throughout the production process, leaving a timestamp record for the entrance and departure of most SKUs to the various links in the supply chain. After reflecting on this, I realized the massive potential to re-interpret and re-imagine this data to creatively address a major operational issue.

Each yellow dot represents where an entrance or a departure record can be generated as SKUs travel through the different functions. An SKU transferring from one location to another will add two new rows to the SAP database (ex: SKU#999 Warehouse#1 -10pcs; SKU#999 IQC#3 +10pcs). Correctly identifying the routes that each batch of material took in a database appended by time was a big challenge.
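The route-reconstruction step can be illustrated with a minimal Python sketch. The column names and record layout here are hypothetical simplifications of the actual SAP export; the idea is only that each physical transfer leaves a matched pair of rows (a negative quantity at the source, a positive one at the destination) that must be re-joined to recover the movement:

```python
from collections import defaultdict

# Hypothetical simplified rows: (timestamp, sku, location, qty_change).
# A single transfer of 10 pcs of SKU#999 from Warehouse#1 to IQC#3
# appears as two rows: -10 at the source and +10 at the destination.
rows = [
    ("2020-03-01 08:05", "SKU#999", "Warehouse#1", -10),
    ("2020-03-01 08:05", "SKU#999", "IQC#3", +10),
    ("2020-03-01 11:40", "SKU#999", "IQC#3", -10),
    ("2020-03-01 11:40", "SKU#999", "Warehouse#1", +10),
]

def pair_movements(rows):
    """Match each -qty row with the +qty row sharing the same
    timestamp, SKU, and absolute quantity to recover one movement."""
    pending = defaultdict(list)  # (timestamp, sku, qty) -> source locations
    movements = []
    for ts, sku, loc, qty in rows:
        if qty < 0:
            pending[(ts, sku, -qty)].append(loc)
        elif pending[(ts, sku, qty)]:
            src = pending[(ts, sku, qty)].pop(0)
            movements.append((ts, sku, src, loc, qty))
    return movements

# Each resulting tuple: (timestamp, sku, from_location, to_location, pieces)
print(pair_movements(rows))
```

In the real dataset the matching was far messier (batches split and merged, and rows only shared an append order by time), which is why this step consumed most of the effort.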

It is worth noting that to utilize the database, I had to assume that account transfers are equivalent to material movements. It is possible that some physical goods were transferred to fulfill an urgent request in the morning, while the virtual accounts were not updated until the afternoon as work slowed down. I checked with the system's users to see whether that could be the case and found that this special circumstance accounted for only a small proportion of cases, so its impact could be ignored.

Idea Validation

In order to learn the definitions and use cases, I visited the database's key users individually. Every function had its own perspective on and approach to the account recording procedure, and great patience was required to piece together a comprehensive picture. However, gaining an in-depth understanding of the real practices proved invaluable later on, as I was able to explain the distinctive data patterns generated by each party's unique approach to the same scanning task (which would have been confusing had I not studied the users' behaviors first). This experience stressed the importance of marrying domain knowledge to a purely numerical mindset.

Another communication challenge arose when I explained the project’s purpose. Skepticism inevitably surfaced over whether my project might create extra work or unwanted KPIs. I thus quickly learned to communicate the long-term benefit to the stakeholders, take care to not provoke negative feelings, and understand their individual concerns. Stepping out of the cubicles to interact with each person was essential to breaking down barriers. Until that point, tackling individual efficiencies was unprecedented at the company; however, after winning buy-in from all of the stakeholders, we had an opportunity for real change.

Data Analysis

Eventually, I was able to collect three years’ worth (36M) of records from the MRP software for analysis. Distilling information from the dataset represented the bulk of the work. The analysis consisted of the following steps:

  1. Data cleansing: I excluded data beyond the relevant range (ex: material movements after entering the assembly line), and adjusted for process timeframes that covered holidays and weekends.
  2. Pattern extraction: Because the database was designed to be appended by time, I used VBA to extract material movement patterns and restructure the data rows at the SKU level.
  3. UPH calculation: The unit processing lead time could be calculated in Excel by subtracting the input time from the output time, then dividing by batch size; its inverse gives the UPH.
  4. Data interpretation: Grouping and taking the mode of the data points by category helped reveal procedural efficiencies. (Taking the mode was the most suitable way for this scenario because it trims the extreme outliers in each category, which are usually generated by special customer requests.) I also found the top performers for each category to use as baselines in the future.
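Steps 3 and 4 were done in Excel; a rough Python equivalent (with hypothetical per-batch records, since the real schema is not shown here) conveys the idea:

```python
from datetime import datetime
from statistics import mode

# Hypothetical per-batch records: (category, in_time, out_time, batch_size)
batches = [
    ("Display", "2020-03-01 08:00", "2020-03-01 10:00", 600),
    ("Display", "2020-03-01 09:00", "2020-03-01 11:00", 600),
    ("Display", "2020-03-01 08:30", "2020-03-01 14:30", 600),  # special request
]

def uph(in_time, out_time, batch_size):
    """Units per hour: batch size divided by processing hours."""
    fmt = "%Y-%m-%d %H:%M"
    hours = (datetime.strptime(out_time, fmt)
             - datetime.strptime(in_time, fmt)).total_seconds() / 3600
    return batch_size / hours

rates = [round(uph(i, o, n)) for _, i, o, n in batches]
typical = mode(rates)  # the mode discards the outlier from the special request
print(rates, typical)  # -> [300, 300, 100] 300
```

Taking the mode rather than the mean means the 100-UPH outlier does not drag the category's baseline down, mirroring the reasoning in step 4.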

Result

The results shed light on bottlenecks that the PM team’s previous intuition-based management methods had obscured. The analysis not only revealed the procedures that were falling short of expectations, but also areas in the supply chain where we could improve efficiency through minor alterations. For instance, the poor efficiency index of the Display IQC led us to investigate its operations and find that a simple upgrade to their old barcode scanners could increase the team’s overall efficiency two-fold. The need had been long overlooked because frontline workers had a limited voice. I immediately highlighted the need to the manager and resolved the issue within days. Ultimately, with these experiences as a template, we became more skilled at pinpointing urgent problems and prioritizing resources to address their root causes.

Tips for Tomorrow

§ Be aware of how your company’s governing structure creates gaps that prevent the organization from achieving truly optimized supply chains. Check for cohesiveness between the goals of each department. No perfect organizational archetype exists, but end-to-end coordination can help close those gaps.

§ Always remember to tie individual value to the organization’s value. Take some time out from the daily hustle to consider, “how can my role bring value to my company’s strategic goals?” For example, a project manager working for an OEM company should never lose sight of efficiency as the “ultimate goal” while making day-to-day decisions.

§ Participate and observe frontline actions carefully. Understanding real practices informs better supply chain decision-making.

Please leave a comment below and let me know your thoughts. If you would like to see more articles like this, please give me 10 claps!
