Compuprod Case Study

The Compuprod case study, run with students at my local university college, helped to determine the main communication flows within an example company.

Each group played a department within the company. The UML diagrams below result from the exercise and the class’s thinking.

The exercise demonstrated the necessity for communication, and for coherent systems to support it. Each actor requires and supplies a set of information to complete their job.

This interdependence reinforces the need to apply methods to ensure that the quality of data exchanged remains high to maintain efficiency within the organisation.

Process across the supply chain

The sequence diagram below illustrates communication between people in different departments.

The company context

The company is a large Japanese computer producer with an independent European direct sales force. It has production and distribution facilities in western France, Scotland and Malaysia.

In the context of the computer production market between 2000 and 2004, hindsight shows us that there was a general movement to outsource production facilities in Asia.

A Japanese company merged an American computer producer with a struggling European one that had its own production facilities, only to relocate those facilities to Asia.

The company had previously invested in antiquated green-screen systems. PeopleSoft replaced these multiple systems, allowing global access to all core systems.

The project centred on the Purchasing department which managed 80% of the product cost.

The database project centralized Purchasing data into a coherent corporate database, in line with the strategic information systems plan.

Over the space of four years, the database expanded to include Marketing forecasts, Production output and Key Performance Measurements.

Database functionality

A database model written as a UML sequence diagram helps transversal actors gain a common understanding of functional requirements.

Its initial function was to consolidate price data. It evolved into a demand aggregator and performance measurement tool.

Database requirements:

  • Consolidate forecast and production data from 14 sources.
  • Receive manual price input from buyers.
  • Output key indicators, such as average supplier payment terms.
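The last requirement, average supplier payment terms, is a simple aggregate. A minimal sketch of how such an indicator might be computed is below; the field names and the choice of a spend-weighted average are my own illustrative assumptions, not details from the original reports.

```python
def average_payment_terms(invoices):
    """Spend-weighted average payment terms across suppliers.

    invoices: list of (supplier, amount, payment_days) tuples.
    The weighting by spend is an assumption; the original report
    may have used a simple per-supplier average instead.
    """
    total_spend = sum(amount for _, amount, _ in invoices)
    if total_spend == 0:
        return 0.0
    return sum(amount * days for _, amount, days in invoices) / total_spend
```

Weighting by spend means a large supplier on 60-day terms moves the indicator more than a small one on 30-day terms, which is usually what a cash-flow-focused manager wants to see.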

The system involves five main actors: three internal, and two external – suppliers providing price and supply commitments, and customers providing their demand forecasts. ERPs cover these transversal supply-chain functions, and for good reason.

Senior management must own transversal functions to ensure that specifications include company-wide requirements. ERP development aims to standardise flows and speed up processing times between actors with specific information needs.

A written functional model provides a common understanding of system function.

A UML sequence diagram across a purchasing supply chain

The consolidation mechanism:

  • Import Excel sheets into an Access table
  • Search and compare item descriptions against the database
  • If found, attribute an identifier; if not, create a new item
  • Copy temporary import tables into main tables
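The match-or-create step at the heart of the mechanism above can be sketched as follows. This is a simplified in-memory sketch, not the original Access implementation: the function name, field names, and the upper-case normalisation rule are all illustrative assumptions.

```python
def consolidate(import_rows, item_register):
    """Merge rows from one imported source sheet into the item register.

    import_rows  : list of dicts, each with a 'description' key
    item_register: dict mapping normalised description -> item id
    Returns the rows with an 'item_id' attached, creating new
    identifiers for descriptions not yet in the register.
    """
    next_id = max(item_register.values(), default=0) + 1
    consolidated = []
    for row in import_rows:
        # Normalise before matching; the real process compared
        # descriptions against the database in a similar spirit.
        desc = row["description"].strip().upper()
        if desc in item_register:          # found: reuse the identifier
            item_id = item_register[desc]
        else:                              # not found: create a new item
            item_id = next_id
            item_register[desc] = item_id
            next_id += 1
        consolidated.append({**row, "item_id": item_id})
    return consolidated
```

In the real system each of the 14 sources would pass through this step before the temporary import tables were copied into the main tables.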

The consolidation process was in practice extremely complex. Around 14 sources were combined into one. Each source was different and could change at each submission.

There was insufficient management control to ensure that the participants respected a defined data template. The lack of management buy-in resulted in delays and extra effort on the part of the data engineer.

The complexity and variability of the production process caused problems in the reporting cycle. The credibility of the project suffered as a consequence of lengthy and complex data processing.

It was essential to produce quality data. The data engineer implemented significant quality checks to ensure high data quality. An example of this was to compare item descriptions with a central register, with any anomalies validated manually by experts.
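One way to implement such a check is to pass exact matches automatically and flag near matches for manual validation. The sketch below uses Python's standard `difflib` for the similarity test; the 0.85 cutoff and the three-way classification are my own assumptions, not details from the original process.

```python
import difflib

def check_descriptions(descriptions, central_register, cutoff=0.85):
    """Compare incoming item descriptions against a central register.

    Exact matches pass; near matches are flagged as anomalies for an
    expert to validate manually; the rest are treated as new items.
    The cutoff value and return shape are illustrative assumptions.
    """
    passed, anomalies, new_items = [], [], []
    for desc in descriptions:
        if desc in central_register:
            passed.append(desc)
            continue
        close = difflib.get_close_matches(
            desc, central_register, n=1, cutoff=cutoff
        )
        if close:
            anomalies.append((desc, close[0]))  # likely a variant spelling
        else:
            new_items.append(desc)
    return passed, anomalies, new_items
```

The anomaly list is the key output: rather than silently creating duplicate items for misspelled descriptions, the process surfaces them for an expert decision.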

Data pertinence

A long and complex data process is costly in terms of time and effort. Reports produced are also no longer real-time. Any further analysis must wait for the next batch of data processing.

Analysts should be analysing rather than spending time on creating the data. Sufficient resources should be allocated so that data is made available quickly, and ideally separate people perform continuous analysis which feeds back into data production.

The KPI management process

The database was used for KPI (Key Performance Indicator) reporting. Reports covered supplier performance on payment terms and the number of suppliers.

If management uses the data against the people who produce it, the data will not be forthcoming. Here, managers asked buyers to provide data which demonstrated the buyers' own performance. If the management attitude is inclusive and understanding, people will declare their data, because they feel they are being helped to achieve their objectives. They will resist if management uses data to penalise.

People will be more willing to manage an involved data management process if the figures they produce are useful for their own work, not just for management reporting.

Summary:

  1. Initial objectives: integrate buyer pricing and conditions.
  2. Secondary objectives: measure improvement in conditions and use planning data to negotiate conditions.
  3. Tertiary objectives: use the centralized database to migrate to a global information system: PeopleSoft.

Critique:

Users did not have sufficient input into the requirements analysis process, nor into the reports and outputs.

Only very senior managers made decisions. Communication down the chain was not always open and clear.

Incremental database design: the initial project brief did not include the above objectives.

The initial design did not include the full set of functionality. The database changed many times with insufficient change management.
