
Monday, 14 September 2015

Digital makes current supply chain models obsolete – What companies can do


The impact

Digital technology is disrupting traditional operations, and today every business is a digital business. The impact on supply chain management is particularly great: companies must re-invent their supply chains to unlock the full potential of digital.

Re-adapting is not enough; digital is too different, and traditional governance mechanisms and business processes are too inflexible. Piecemeal digitization of individual supply chain elements is counterproductive. Instead, companies must re-imagine their supply chains as integrated digital supply networks.

Digital technology renders traditional supply chain models obsolete. It can and will destroy existing business and operating models: it changes the nature of control points and the role & value of data, and it shifts where value is created at each stage of the value chain.

Examples of successful digital companies

Coca-Cola, for example, analyzes up to one quintillion data points with its Black Book algorithms to assure a consistent 12-month supply of its orange juice, and makes manufacturing plans 15 months in advance based on external factors such as weather, expected crop yields and cost pressures.

Uber and Lyft leverage mobile technology to create secure temporary transportation by connecting drivers with riders.

Cisco Systems has stated that supply chain and logistics constitute almost $3 trillion in value at stake (a combination of increased revenue and lower costs). This value can be captured by uniting products, talent, information and currency electronically via a digital supply network.

SMAC technologies are the most disruptive

Social media can help companies tap innovation from outside the organization and generate demand triggers for specific products & services (Physical Supply Chain); provide customization and community building through social channels and targeted product & service offerings (Information Supply Chain); and solicit feedback and reduce selling costs (Financial Supply Chain).

Mobile communication provides real-time, 24/7 connectivity and support for corporate field forces (Talent Supply Chain); offers store-specific apps that drive demand (Physical Supply Chain); provides updates on product deliveries (Information Supply Chain); and enables remote payments and new buying opportunities (Financial Supply Chain).

Analytics can evaluate employee performance & behavior to improve effectiveness & efficiency (Talent Supply Chain); implement alerts & response actions and assist with predictive maintenance (Physical Supply Chain); reveal customer behaviors that inspire new products, services and customization opportunities (Information Supply Chain); and help optimize procurement spend (Financial Supply Chain).

Cloud computing provides remote access for experts to help companies educate, train and solve problems (Talent Supply Chain); leverages the contributions of partners & suppliers through portals hosted in the cloud (Physical Supply Chain); increases access to applications and crowd-sourcing opportunities (Information Supply Chain); and provides end-to-end source-to-pay functionality (Financial Supply Chain).
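To make the analytics ideas above more concrete, here is a minimal, hypothetical Python sketch of an alert & response action of the kind a Physical Supply Chain team might run for predictive maintenance: it watches a stream of sensor readings and raises an alert when the moving average drifts past a threshold. The window size, threshold and readings are illustrative assumptions, not values from any real system.

from collections import deque

def maintenance_alerts(readings, window=5, threshold=80.0):
    # Yield an alert whenever the moving average of the most recent
    # readings exceeds the threshold (all values illustrative)
    recent = deque(maxlen=window)
    for value in readings:
        recent.append(value)
        if len(recent) == window and sum(recent) / window > threshold:
            yield f"ALERT: moving average {sum(recent) / window:.1f} exceeds {threshold}"

# Example: simulated vibration readings from a conveyor motor
for alert in maintenance_alerts([70, 72, 75, 79, 83, 86, 90]):
    print(alert)

In a real deployment, the same pattern would run over live equipment telemetry and the alert would trigger a maintenance work order.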
 

Digital technologies make supply chains Connected, Intelligent, Rapid and Scalable

Digital technologies enable networked processes and optimization across the entire enterprise rather than just individual functions. They unite all stakeholders across the value chain and inspire new ways of collaborating and innovating, and they can help companies mass-produce and mass-customize products & services at the same time.


Becoming Connected gives companies real-time visibility, seamless collaboration within and beyond physical boundaries, and the ability to adjust product & service functions as well as business & operating models.

Ariba, for example, achieves a complete and seamless source-to-pay process by connecting 1 million suppliers and 4 million users in over 190 countries through the cloud.

Becoming Intelligent allows companies to leverage analytics, cognitive equipment and smart apps to turn data into valuable information and actionable insights, enabling predictive decision making, automated execution (with seamless human-machine interactions), increased operational efficiency and enhanced, accelerated innovation.

Becoming Scalable lets companies more easily optimize and duplicate processes, scale supply chains up or down, and add or remove partners & suppliers; it also helps them target niche markets, segments and customers more effectively. Digital plug-and-play capabilities make it easier to configure and re-configure, and channel-centric supply networks support customized products & services and personalized experiences.

Lockheed Martin Corporation developed the Digital Tapestry, bringing digital design to every stage of the production process. It includes 3-D virtual simulations for design and 3-D printing technologies for prototyping and production. The result is a less expensive, more reliable process carried out in a completely virtual environment: designers can manipulate parts or entire machines and see how they fit together and operate, while the system responds with a constant stream of automatically updated specifications.

Becoming Rapid – Speed is one of the most important currencies of the future. Digital technologies help companies diagnose, adjust and execute more rapidly and efficiently, shifting resources from within the company to across the extended enterprise.

Enhanced responsiveness and sophisticated analytics help accelerate responses to changing demand, supply signals, competitor moves and technology shifts. Proactive prevention and predictive analytics can increase reliability and adaptability. Last mile postponement helps swiftly repurpose organizational assets and align supplies with evolving demands.

Dell launched global command centers to monitor supply chain activities and make adjustments in real time. The platform links with global data systems and monitors service dispatch activity, matching each dispatch with the optimal part location. It also serves as a trend-spotting, early-warning and feedback system.
 

Key steps for companies to build their Digital Supply Network

The supply chain is evolving from a function concerned with the expedient movement of materials to an inter-enterprise discipline that concurrently optimizes materials, talent, resources, information and finances. The supply chain takes a crucial role in the enterprise-wide realization of outcome-focused missions like “create a highly differentiated customer experience”, “achieve perfect order rates” or “accelerate innovation”.

The new supply network is built with digital DNA. It is important to follow a systematic process to transform a traditional supply chain into a digital supply network:

1)      Envision specific business outcomes for now and a decade into the future.

2)      Conduct a solid value chain analysis (see my blog on Prime Value Chain Analysis).

3)      Map your digital journey with a blueprint of your future organization, including the people, process, technology and governance aspects of the transformation. It should outline the convergence of the Talent, Physical, Information and Financial Supply Chains into one cohesive network to assure a vibrant, interconnected ecosystem. Include a transformation plan from the existing technology landscape to the future digital one.

Key trends, aspects and opportunities to incorporate into your supply chain

Consider key aspects in your plan

·         Collaborative planning & scheduling - Optimize inventory holding and storage for delivery within hours to a customer for a premium fee.

·         Dynamic inventory and replenishment planning (based on real-time visibility across the extended supply chain) for greater customer assortment, faster delivery and streamlined product flow; a simple reorder-point sketch follows this list.

·         Leverage external talent and infrastructure beyond company boundaries: tap social networks, interest groups and customer product development forums to create new innovation; combine this with upskilling internal employees to enable superior customer experience & service.

·         Precision pricing based on on-the-ground and online intelligence and analytics.

·         Procurement mall – an online IT system & helpdesk provides intelligent choices for procurement and facilitates end-to-end procurement operations on a self-service basis. Allows merchants to access and apply best practices from across the global enterprise.

·         Transport planning based on supply-side intelligence, providing cloud-based industry-wide collaboration, internal real-time demand visibility and dynamic route planning based on real-time analytics.

·         Automated warehouse operations that connect people and IT systems in real time through smart equipment and RFID-enabled warehouse picking systems to increase accuracy and efficiency.

·         Micro-segmenting of customers to improve store layouts, customize offers and product mix based on customer behavior and social media data.

·         In-store/off-line collaboration: Retailers could offer their physical infrastructure for a fee to the online marketplace and vice versa, providing customers with unprecedented access to assortments, plus on-demand access to inventory in the supply chain network.

·         Transport cooperation to share transport with channel partners and even competitors (e.g., Nestlé and Coca-Cola).

·         Shopper Insight – Customer preferences to drive product mix, promotions & sales through fast data analytics, alerts and in-store devices.

·         On-spot selling – arming in-store employees with customer-specific information for advice and upselling.

·         Movable supplies – Bring products closer to the customer to reduce delivery time based on demand sensing and movable warehouse capacity.

·         Last mile delivery – Consolidate deliveries across network, supply chain partners, other retailers and even competitors to reduce costs/ expedite delivery; use also pick-up lockers.
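As referenced in the dynamic replenishment bullet above, the planning logic can be illustrated with a minimal reorder-point sketch in Python. The demand figures, lead time and service factor are hypothetical; in practice they would be fed by real-time visibility across the extended supply chain.

import statistics

def reorder_point(daily_demand, lead_time_days, service_factor=1.65):
    # Average demand over the lead time plus safety stock for variability;
    # a service factor of 1.65 approximates a 95% service level
    avg = statistics.mean(daily_demand)
    sd = statistics.stdev(daily_demand)
    safety_stock = service_factor * sd * lead_time_days ** 0.5
    return avg * lead_time_days + safety_stock

# Hypothetical: two weeks of daily demand, 4-day supplier lead time
demand = [120, 135, 110, 150, 125, 140, 130, 145, 115, 138, 128, 142, 133, 127]
print(f"Reorder when stock falls below {reorder_point(demand, 4):.0f} units")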

Special thanks & credits to Gary Hanifan, Aditya Sharma and Carrie Newberry of Accenture Strategy – Operations for their thought leadership and publications. This article has incorporated much of their content.


+++
To share your own thoughts or other best practices about this topic, please email me directly at alexwsteinberg (@) gmail.com.

Alternatively, you may also connect with me and become part of my professional network of Business, Digital, Technology & Sustainability experts at

https://www.linkedin.com/in/alexwsteinberg   or
Xing at https://www.xing.com/profile/Alex_Steinberg   or
Google+ at  https://plus.google.com/u/0/+AlexWSteinberg/posts


 
 

Saturday, 5 September 2015

Big Data & Analytics - the full view (upon request)

Upon request, I have put together the five parts of the previously published Big Data series and combined them into one document, so you can read everything in one place. Please share your thoughts and best practices with me; you may always email me directly at alexwsteinberg@gmail.com
Big Data Series – Part 1 – Technical challenges
Big Data requires companies to learn much about data as an asset and about analytics. Data is the most precious asset in an organization, the currency of the enterprise.
Companies’ data ecosystems have become complex and littered with silos. A large majority of companies are still unable to make full use of Big Data’s advantages.
There are many challenges with Big Data: lack of knowledge, varying definitions & expectations, different views about data sources and use cases, ignorance of valuable data sources and technologies, etc.
Companies must understand data across the entire data supply chain and its individual stages: identifying & leveraging data sources, importing data, enhancing its value, combining it with other data, generating insight, and taking specific actions.
This means companies must mobilize data across the enterprise; deeply understand, analyze and determine the value of the respective data; and understand business use cases and data patterns to determine appropriate actions.
It requires companies to commit to continuous discovery, experimentation, testing, learning, adapting and innovation.
Many approaches, solutions and technologies are currently offered in the Big Data domain, and they are evolving quickly. Companies need to be aware of the different options and their pros & cons to combine them into an overall solution.
Big Data Series – Part 2 – Traditional data approaches are not enough anymore
Given the varying types, sources and sheer size of data today, the traditional approach of collecting data in a staging area, transforming it into the desired format, loading it into a mainframe or data warehouse, and then delivering requested data to users via point-by-point queries no longer works well.
Companies must perform calculations, run simulation models and compare statistics at high speed to generate insights. Real-time analytical tools that can pre-process streaming data and correlate data from internal and external sources offer interesting opportunities, but also complex challenges.
Data acceleration enables massive amounts of data to be ingested, processed, stored, queried and accessed much faster. It ensures multiple ways for data to come into the company’s data infrastructure and be referenced quickly.
Data acceleration leverages hardware and software power through clustering and helps correlate different data sources, including localization. It improves interactivity by enabling users and applications to connect to the data infrastructure in universally accepted ways and ensuring that user queries are delivered as quickly as required.
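To illustrate the correlation idea with a small, hypothetical Python sketch: two feeds, one internal (orders) and one external (a weather forecast), are joined on a shared key so that an insight can be derived as data arrives. Sources, fields and values are invented for illustration.

# Hypothetical internal and external feeds keyed by region
orders = {"north": 120, "south": 95}          # units ordered today (internal)
weather = {"north": "storm", "south": "sun"}  # forecast feed (external)

# Correlate the two sources: flag regions where a storm may delay delivery
for region, units in orders.items():
    if weather.get(region) == "storm":
        print(f"{region}: {units} units at risk - re-route or pre-position stock")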
Big Data Series – Part 3 – Six technology components for Data Acceleration
There are at least six key technology components to build a supporting architecture: Big Data platforms, Ingestion solutions, Complex event processing, In-memory databases, Cache clusters and Appliances. Each component helps with data movement (from source to where needed), processing and interactivity (the usability of the data infrastructure).
Big Data platform (BDP)
A BDP is a distributed file system and compute engine. It contains a big data core: a computer cluster with distributed data storage and computing power. Replication and sharding partition very large databases into smaller, more easily managed parts in order to accelerate data storage.
Newer additions enable more powerful use of core memory as a high-speed data store. These improvements allow for in-memory computing. Streaming technologies added to the core can enable real-time complex event processing. In-memory analytics support better data interactivity.
Further enhancements to the big data core create fast and familiar interfaces to data on the cluster. The core stores structured and unstructured data, but requires map/reduce functionality to read it. Query engine software enables the creation of structured data tables in the core and common query functionality (SQL, etc.).
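Because reading the core relies on map/reduce, a toy word count in plain Python may help illustrate the pattern; a real big data platform distributes the same two phases across the cluster. This is a generic sketch, not code from any specific product.

from collections import Counter
from itertools import chain

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in one document
    return [(word.lower(), 1) for word in document.split()]

def reduce_phase(pairs):
    # Reduce: sum the counts for each distinct word
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return counts

documents = ["supply chain data", "data supply network", "digital data"]
mapped = chain.from_iterable(map_phase(d) for d in documents)
print(reduce_phase(mapped))  # e.g. Counter({'data': 3, 'supply': 2, ...})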
Ingestion
Collecting, capturing and moving data from its sources to the underlying repositories was traditionally done through the extract, transform and load (ETL) method. Today the priority is not the structure of the data as it enters the system, but ensuring that all data, across an increasing range of types & sources, is gathered and quickly transported to where users can process it. Ingestion solutions cover both static and real-time data: the data is gathered by a publisher and then sent to a buffer/queue, where the user can request it.
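A minimal sketch of the publisher-to-buffer/queue flow, using Python's in-process queue as a stand-in for a distributed message queue; the records and source names are invented. Note that the records enter the buffer untransformed, in line with the ingest-first priority described above.

import queue

buffer = queue.Queue()  # stands in for a distributed message queue

def publish(record):
    # The publisher gathers data and pushes it into the buffer as-is
    buffer.put(record)

def consume():
    # A downstream user requests records when ready to process them
    while not buffer.empty():
        yield buffer.get()

publish({"source": "sensor-12", "value": 42})
publish({"source": "weblog", "value": "GET /order/7"})
for record in consume():
    print(record)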
Complex Event Processing (CEP)
After data ingestion, CEP is responsible for pre-processing and aggregation (and for triggering events). It tracks, analyzes and processes event data and derives conclusions. CEP draws data from multiple sources and combines historic as well as fresh data in order to infer patterns and understand complex circumstances. Its engines pre-process fresh data streams from their sources, expedite processing of future data batches, match data against pre-determined patterns and trigger events based on detected patterns.
CEP offers immediate insight and enables fast action. In-memory computation allows data movement and processing to run in parallel, increasing speed. CEP solutions add computing power by processing the data before it is submitted to the data stores or file systems.
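A minimal, hypothetical sketch of the pattern-matching idea: the engine keeps a short history of incoming events, matches fresh events against a pre-determined pattern (here, three failed logins in a row) and triggers an event when the pattern is detected. The pattern and events are invented for illustration.

from collections import deque

class SimpleCEP:
    # Toy complex event processor: detect a run of identical event types
    def __init__(self, pattern_type="login_failed", run_length=3):
        self.pattern_type = pattern_type
        self.history = deque(maxlen=run_length)

    def process(self, event):
        self.history.append(event["type"])
        if list(self.history) == [self.pattern_type] * self.history.maxlen:
            return f"TRIGGER: {self.history.maxlen}x {self.pattern_type}"
        return None

cep = SimpleCEP()
for event in [{"type": "login_failed"}] * 3:
    result = cep.process(event)
    if result:
        print(result)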
In-memory databases (IMDB)
IMDBs are faster than traditional databases because they use simpler internal algorithms and execute fewer central processing unit instructions. The database is preloaded from disk into memory; accessing data in memory eliminates the seek time involved in querying data on disk storage. Applications communicate through SQL; the database receives the records in RAM and triggers the query optimizer.
IMDBs constrain the entire database to a single address space, so any data can be accessed within microseconds. Steadily falling RAM prices favor this solution.
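Python's standard library makes the single-address-space idea concrete: the sqlite3 module can run an entire database in RAM via the ":memory:" path, so queries never touch disk. A dedicated IMDB adds the clustering, concurrency and durability features this sketch omits.

import sqlite3

# ":memory:" keeps the whole database in RAM - no disk seeks
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(1, "acme", 120.0), (2, "globex", 75.5), (3, "acme", 40.0)])

# Standard SQL runs against the in-memory data
for row in db.execute("SELECT customer, SUM(amount) FROM orders GROUP BY customer"):
    print(row)  # ('acme', 160.0) then ('globex', 75.5)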
Cache Clusters
Cache clusters are clusters of servers whose memory is managed by central software designed to shift load from upstream data sources (databases) to applications and users. They are typically maintained in-memory, offer fast access to frequently accessed data, and sit between the data source and the user. Traditionally they accommodate simple operations such as reading and writing values, and they are populated when a query is sent from a data user to the source. Pre-populating a cache cluster with frequently accessed data improves response time. Data grids take caching a step further by supporting more complex queries and using massively parallel processing (MPP) computations.
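The read path of a cache cluster follows the cache-aside pattern, sketched below with a plain dictionary standing in for the distributed in-memory store; slow_database is a hypothetical stand-in for the upstream source.

cache = {}  # stands in for a distributed in-memory cache cluster

def slow_database(key):
    # Placeholder for the upstream, disk-backed data source
    return f"value-for-{key}"

def read_through(key):
    # Serve from cache when possible; populate the cache on a miss
    if key not in cache:
        cache[key] = slow_database(key)
    return cache[key]

print(read_through("sku-1001"))  # miss: hits the database and fills the cache
print(read_through("sku-1001"))  # hit: served straight from memory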
Appliance
An appliance sits between data access and data storage and performs massively parallel processing. An appliance here is a pre-configured set of hardware and software including servers, memory, storage, input/output channels, operating systems, DBMS, administration software and support services.
It may have a common database for online transactions and analytical processing, which improves interactivity and speed. Appliances can perform complex processing on massive amounts of data.
Implementing and maintaining high-performance databases on clusters is challenging, and few companies have the necessary expertise to do so themselves.
Custom-silicon circuit boards enable companies to develop their own specific solutions. They enable development of devices for specific use cases and allow for network optimization (integrating embedded logic, memory, networking and processor cores). This plug-and-play functionality offers interesting possibilities.
Big Data Series – Part 4 – Creating a suitable Technology Stack/Solution
All of these components bring their own technology features. Companies must wisely assemble an overall solution from these components, leveraging their complementary advantages and customizing them to their particular needs.
Four fundamental technology stacks (with their variations) offer possible solutions; a small end-to-end sketch of option 2 follows the list:
  1. Big data core only or with enhancements (with complex event processing, with in-memory database, with query engine, or with complex event processing and query engine)
    • This technology is the de-facto standard for exceptional data movement, processing and interactivity.
    • Data usually enters the cluster through batch loads or streaming.
    • Events are not processed immediately but in intervals; this enables parallel processing on large data sets, and thus advanced analytics.
    • Applications and services may access the core directly, delivering improved performance on large, unstructured data sets.
    • Adding CEP enhances the big data core's processing capabilities, enabling real-time detection of patterns in data and triggering of events. It enables real-time animated dashboards; a machine learning program could be added to the CEP.
    • An IMDB can further increase computing power by placing key data in RAM.
    • Query engines can further open interfaces for applications to access big data even faster.
  2. In-memory database (IMDB) cluster only or with enhancements (with Big Data Platform, with complex event processing)
    • External data is streamed in or transferred in bulk to the IMDB.
    • Users and applications can directly query the IMDB, usually through SQL-like structures.
    • With a BDP, incoming data is first pre-processed by the BDP before it goes to the IMDB.
    • With CEP, the CEP first ingests the data; processing is then done in the IMDB, and results are returned to the application for faster interactivity.
  3. Distributed cache only or with enhancement (with application and Big Data Platform)
    • A simple caching stack sits atop the data source repository. The application retrieves the data, and the most relevant data subset is placed in the cache.
    • Processing of the data falls to the application (which may result in slower processing speeds).
    • With a BDP, the BDP ingests the data from the source, does the bulk of the processing, and then puts the data subset in the cache.
  4. Appliance only or with enhancement (with Big Data Platform)
    • Data streams directly into the appliance; the application talks directly to the appliance.
    • With a BDP, the BDP ingests and processes the data; the application can talk directly to the appliance for queries.
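As promised above, here is a hypothetical end-to-end sketch of option 2 in Python: a CEP stage ingests and pre-processes events, an in-memory database (sqlite3 standing in for an IMDB cluster) stores the results, and the application queries it directly through SQL. All names and values are illustrative.

import sqlite3

imdb = sqlite3.connect(":memory:")  # stands in for an IMDB cluster
imdb.execute("CREATE TABLE readings (sensor TEXT, value REAL)")

def cep_ingest(event):
    # CEP stage: pre-process (filter invalid readings) before loading the IMDB
    if event["value"] >= 0:
        imdb.execute("INSERT INTO readings VALUES (?, ?)",
                     (event["sensor"], event["value"]))

for event in [{"sensor": "a", "value": 1.5},
              {"sensor": "a", "value": -9.0},  # filtered out by the CEP stage
              {"sensor": "b", "value": 2.5}]:
    cep_ingest(event)

# The application queries the IMDB directly, usually through SQL
print(imdb.execute(
    "SELECT sensor, AVG(value) FROM readings GROUP BY sensor").fetchall())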

Big Data Series – Part 5 – 12 Immediate suggestions to build a data supply chain
  • Consider data as perhaps the most important asset in your organization. Become data driven; some people call it “data religious”.
  • Research Big Data & Analytics best practices. This requires continuous learning. Refer to the different approaches offered in previous blogs (Data Acceleration Parts 1 and 2).
  • Do an inventory of existing data. Focus on the most frequently accessed and time-relevant data.
  • Identify, simplify and optimize inefficient data processes. Eliminate manual, time-consuming data curation processes (such as tagging and cleaning).
  • Identify currently unmet business needs and develop solutions.
  • Identify and overcome data silos.
  • Simplify and standardize data access through a robust data platform.
  • Build an effective technology stack using one of the four suggested options while leveraging the six described components (Data Acceleration Parts 1 and 2).
  • Further explore API management, traditional middleware, PaaS and other possibilities.
  • Analyze current internal data sources and look for still-hidden sources. Explore external sources to increase the quantity and quality of available data.
  • Identify and improve individual data supply chain streams.
  • Develop a systematic roadmap for building an effective overall data supply chain.

Special thanks to Accenture Technology Labs and the Analytics Group, whose thought leadership, best practices and white papers have served as an inspiration and knowledge source for this Big Data series.

+++
To share your own thoughts or other best practices about this topic, please email me directly at alexwsteinberg (@) gmail.com.

Alternatively, you may also connect with me and become part of my professional network of Business, Digital, Technology & Sustainability experts at

https://www.linkedin.com/in/alexwsteinberg   or
Xing at https://www.xing.com/profile/Alex_Steinberg   or
Google+ at  https://plus.google.com/u/0/+AlexWSteinberg/posts



Thursday, 16 July 2015

Digital Supply Chain - Framework and best practices


Digitalization is impacting (and disrupting) almost every industry. Supply Chain Management (SCM) is a prime candidate for value creation using digital thinking and technologies.

The complexity, scope and fluidity of today’s supply chains create enormous amounts of data, information and change. SCM must integrate, manage, optimize and change activities, ideally in real time, as the market and other factors require.

SCM spans both internal and external operations, covering clients, end customers, partners, suppliers and service providers on a global scale. Government and trade bodies, NGOs and other parties add further challenges. SCM deals with a huge number of diverse stakeholders, creating a complex web of contacts, roles, interests and flows of goods, services and information.

In this blog, I would like to share thoughts on improving and digitalizing supply chain management. As always, I base my views on best practices; here, I summarize and present a methodology from Capgemini Consulting that I regularly use for benchmarking:

Traditional supply chain models have resulted in rigid organizational structures, inaccessible data and fragmented relationships with partners. We often find a combination of key deficits: lack of transparency, agility and end-to-end process integration; sub-optimal use of locations, labor cost differences and task bundling; and overly complex IT landscapes.

Often, several hundred applications support supply chain processes, leading to lengthy implementation cycles and overly high maintenance costs. Disparate IT systems introduce inconsistency and redundancy into the data.

Digital supply chains are based on a digital operating model that implements digital capabilities along the organizational layers of governance, processes, data & performance management and IT. Such a model enables business process automation, organizational flexibility and digital management of corporate assets.

Business Process Automation bears a value driver potential of, on average, 20 percent of the cost base. It integrates business processes, supports collaboration with customers and suppliers, runs event-driven process scenarios and embeds analytics/optimization. It is about straight-through processing: the complete execution of end-to-end processes without re-keying or manual intervention. All necessary data is available to employees to complete their transactions, and the management of physical flows is enabled by a closely knit web of checkpoints that are tracked and monitored.

Organizational Flexibility bears a value driver potential of, on average, 50 percent of the cost base. It accelerates business process innovation, manages a mix of global and local processes, flexibly handles in- and outsourcing, and rapidly implements new business models. It gives greater freedom to choose the degree of centralization needed to support specialization or minimize process costs. Centralizing specific functions can generate higher value through better quality and productivity. Central master data management helps avoid double entries and inconsistencies, while supply chain planning activities benefit from a bigger pool of optimization objects.

Digital Management of Corporate Assets bears a value driver potential of, on average, more than 5 percent of the cost base. It generates new business insights, operates a scalable data model (processes, product lines, customers) and integrates views (financial and operational KPIs, internal and market data). As information becomes available at the micro level, companies can treat a single customer order as a profit center or a single process as a cost center. Aggregating all these transactions results in much more accurate performance measurement for a specific customer, industry segment or location.
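The micro-level idea can be illustrated with a short, hypothetical Python aggregation: each transaction carries its own revenue and cost (its own profit center), and rolling the transactions up by customer yields the per-customer performance view described above. The records are invented.

from collections import defaultdict

# Hypothetical transaction-level records: each order is its own profit center
transactions = [
    {"customer": "retailer-A", "revenue": 500.0, "cost": 420.0},
    {"customer": "retailer-B", "revenue": 300.0, "cost": 310.0},
    {"customer": "retailer-A", "revenue": 250.0, "cost": 180.0},
]

profit = defaultdict(float)
for t in transactions:
    profit[t["customer"]] += t["revenue"] - t["cost"]

for customer, p in sorted(profit.items()):
    print(f"{customer}: {'profit' if p >= 0 else 'loss'} of {abs(p):.2f}")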

Capgemini offers a systematic framework with five layers for the Digital Transformation of Supply Chain Management:


At the top, layer 1, the Digital Supply Chain Strategy integrates digital initiatives into the overall supply chain strategy in order to generate and measure long-term value. Identifying business benefits requires top management expertise and input on currently perceived pain points and industry best practices. Typical findings of a pain-point analysis are broken processes, local instead of global optimization, low visibility, etc.

The Supply Chain Operating and Governance Model, layer 2, helps realize the full potential of being a global company. It examines the internal alignment of roles, procedures, service level agreements and transfer pricing schemes.

Integrated Supply Chain Performance Measurement, layer 3, uses Web 2.0 technologies to trace every order or transaction. Tagging technologies and virtualized data centers make the information available; combining this operational data with financial information and with external market and benchmarking data improves decision making.

Integrated Supply Chain Performance Management, layer 4, integrates the different supply chain functions such as product development, procurement, production, maintenance and logistics across locations in order to minimize waste and non-value-added activities.

Supply Chain Technology Architecture and Infrastructure, layer 5, provides the design logic for business processes and IT infrastructure, and integrates and standardizes the requirements of the organization’s operating model. The challenge is to select and implement digital technologies and integrated platforms that employ reusable and exchangeable components with minimal investment of time and effort. Examples are RFID, wireless tracking devices, warehouse labor and vehicle management systems, voice-directed picking devices, etc.

Since 2005, I have been working with British Telecom’s Supply Chain Excellence practice. Auditing, evaluating and improving BT’s partners and suppliers, I have witnessed the enormous opportunities to generate value for companies: producing better products & services, responding to clients faster and more flexibly, and developing effective ecosystems of partners, suppliers and customers.

Leading improvement efforts across all business functions and value chains, I have aligned structures & roles with strategies and optimized systems, processes, policies and procedures. Digitalization now gives us the tools and technical capabilities to take supply chain management to the next level: a global, truly holistic, agile, effective and cost-efficient, living and continuously improving ecosystem.



+++
To share your own thoughts or other best practices about this topic, please email me directly at alexwsteinberg (@) gmail.com.

Alternatively, you may also connect with me and become part of my professional network of Business, Digital, Technology & Sustainability experts at

https://www.linkedin.com/in/alexwsteinberg   or
Xing at https://www.xing.com/profile/Alex_Steinberg   or
Google+ at  https://plus.google.com/u/0/+AlexWSteinberg/posts