Why 5W?

5W is my chance to riff off the classic journalistic methodology of the “Five Ws” (‘Who, What, When, Where and Why’) to describe a technology offering and its potential impact to a business. Note that I’ve re-alliterated, co-opted, abused and reordered these to suit my evil purposes. I drop the commonly-added ‘How’, as I am seeking to provide prescriptive business guidance for technology selection and implementation. Happy to engage in online / offline discussions .. send me a note.

5W – Edge Computing

Thank you for reading! Please see “Why 5W?” for context, methodology and disclaimers.

Edge Computing Overview

Edge Computing (EC) is a distributed computing paradigm where highly-scalable and efficient computing resources are made available closer (physically or through optimized network enhancements) to an end user, providing a superior application experience. EC offloads processing that would otherwise make round trips to a primary datacenter / cloud resource, thereby reducing latency and improving application / workload performance. Applications that can benefit from Edge Computing include mobile, web and thick-client (locally installed on a workstation) applications.

The concept of ‘edge’ was known in earlier days as the ‘content delivery network’ (CDN), dating back to the 1990s. Companies like Akamai recognized a business model in providing improved end-user performance by serving static binary assets (images, videos and files) from geographically-dispersed data centers, closer to the end user than the central data center. Content and website owners uploaded large assets (videos, binary / file-based content) and a plethora of smaller, oft-used images (icons and web page graphics) to these servers to help the end user avoid the latency of downloading these assets from central servers. Before the cloud, these servers were originally positioned in major data centers in the cardinal regions (AM, APAC, EMEA), but soon expanded into even more geographically-dispersed edge sites to improve end-user performance.

As you may surmise, companies who deployed assets on CDNs demanded policy-based content management (availability, expiry, etc.), unified upload / addressing for application access and consumption (storage, bandwidth) cost optimization from the CDN vendors. The winners in this early space sorted this through lookups, automation and dynamic addressing, methods deployed to this day.

Also to be expected, functional capability requirements evolved well beyond managing static content to include distributed interim storage, processing and expanded integration capabilities to third-party systems from the decentralized site, hence EC emerged. Early EC implementations were IoT-focused, especially for location-bound services that required increased computing performance to manage the output from sensors, machines and other monitorable / controllable assets.

The EC use case landscape is quite broad. Early deployments supported near-line collection and processing, enhancing data collected from IoT devices. These data were aggregated and correlated to provide a broader picture of the captured data. Next-stage deployments required highly-customized efforts by offering owners (i.e., a lot of code) to integrate these data. As with other emerging technologies, vendors stepped into the gap, providing platforms that reduce the level of effort (and code), enabling offering owners to engage in the new paradigm. Please see some of these in ‘Edge Computing Providers’, below.

EC can be part of, and benefit from, a robust SD-WAN implementation, where devices, on-premises assets and cloud resource connections are managed and optimized centrally through policy. Most importantly, the benefits of an Edge Computing implementation can be measured by improved end-user / customer experience (CXM).

Edge Computing Business Benefits

Before modern browser-based and mobile-enabled capabilities, many applications were delivered to users in a client-server model, leveraging fast bandwidth and the computing capabilities of a local PC to do last-moment processing, dynamic presentation (sorting, graphics generation, etc.), in addition to relying on local temporary storage.

These applications were typically deployed in on-premises environments and connected to the central application server via a Local Area Network (LAN). Application developers / product owners relied on local PC capabilities to enhance the end-user experience for improved data navigation and management. When applications moved to browser-based implementation (with no access to local system resources), extended to mobile devices (multiple operating systems, introducing separate mobile-only code lines) or engaged with ‘dumb’ devices (devices lacking local processing, like sensors or cameras) the need for data aggregation and processing shifted back to the server. This introduced latency as entire data sets had to be transported over a slower connection for heretofore simple operations like sorting or graphics manipulation. As you may surmise, user experience suffered and bandwidth costs increased.

EC enables many of the same types of user experience enhancements through pre-processing content to thin-client implementations (web browsers, mobile devices, thin PCs and embedded endpoints), all without the need to write native client software for each of these devices. EC processing enhancements and close proximity avoid round trips to the central application server, leveraging capacity much closer to the end user, reducing latency and improving functionality. Further, EC enables call-outs to external services without the need to route them through the central application server, enabling application owners to leverage external services far more efficiently.

An EC DIY effort isn’t as easy or as seamless as an organization might hope. Establishing an edge infrastructure is beyond the typical product organization, as it requires securing computing capacity close to your end users. Modern cloud vendors have enabled this capability by offering remote capacity in a pay-as-you-go model (please see ‘Edge Computing Providers’, below). The short bit: aligning with an Edge provider with a platform is a far safer path.

Companies with existing applications encountered challenges getting to the edge as well, as, in short, many older applications were built with a single source server in mind. Companies discovered that off-loading tasks to remote computing capacity required refactoring application code or adding infrastructure to fit the new paradigm. While significant value can be achieved from this effort, companies had to balance the value against the investment. Again, modern cloud vendors and direct-to-edge platforms streamlined this work, prompting application owners to review their existing applications and make them more edge-friendly / cloud-native.

EC is still early enough that many clear standards and verified strategies have yet to emerge. At a high level, there are some (emergent and hyper-simplified) EC offering methodologies:

  • Edge Capacity: IaaS-like computing resources enhanced with the means to perform automated deployments. Code must be written to be deployed across central and Edge servers, as with a distributed application.
  • Edge Platform: A PaaS-like platform that reduces manual deployments and code refactoring by enabling developers to ‘write to the platform’. This makes deployments simpler, but requires developers to refactor their code to Edge-Native standards.
  • Microservices: This option typically requires a significant re-write of application code, including provisioning services that enable standardized access to back-end resources. This is not for the faint of heart, nor those who do not have access to significantly-capable technical resources within their organization.

The first enables lift-and-shift code deployments, but at the cost of increased, remote management efforts. The second provides a far more robust and scalable implementation, but potentially creates vendor lock-in for the end customer. The third requires a code rewrite, all the way from the UI to the backend services. Note that in practice, an implementation may include any of these three, all combined into the final product.
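As a toy sketch of the second methodology’s ‘write to the platform’ model (the `EdgePlatform` class, its registration API and the distance-based routing below are all invented for illustration, not any vendor’s actual SDK), an Edge-Native handler might look like:

```python
# Minimal sketch of an Edge Platform handler. The EdgePlatform class and its
# routing policy are hypothetical: the developer writes one handler and
# registers it; the platform decides which edge site serves each request.

class EdgePlatform:
    """Toy stand-in for a vendor's edge runtime."""
    def __init__(self, regions):
        self.regions = regions          # edge sites where code may be deployed
        self.handlers = {}

    def register(self, path, handler):
        self.handlers[path] = handler   # 'write to the platform' registration

    def dispatch(self, path, request):
        # A real platform routes to the nearest region; we simulate it with a
        # per-request distance map.
        region = min(self.regions, key=lambda r: request["distance"][r])
        return {"served_from": region, "body": self.handlers[path](request)}

platform = EdgePlatform(regions=["us-east", "eu-west", "ap-south"])

def sort_products(request):
    # Work that would otherwise round-trip to the central application server:
    return sorted(request["products"], key=lambda p: p["price"])

platform.register("/products", sort_products)

response = platform.dispatch("/products", {
    "distance": {"us-east": 80, "eu-west": 12, "ap-south": 140},
    "products": [{"sku": "B", "price": 9}, {"sku": "A", "price": 4}],
})
print(response["served_from"])  # the nearest edge site handled the request
```

The design point: the handler contains no deployment logic at all, which is precisely what creates both the simplicity and the vendor lock-in noted above.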

As of this writing, there is a notable lack of codified EC standards and practices, with companies deploying workarounds to accommodate dated application standards and ‘not-quite-right’ edge toolsets. EC concepts are solid .. expect other paradigms to emerge as EC matures. The larger technology companies are the most mature, most notably, Akamai, IBM and Amazon.

Edge Computing Capabilities

The EC Platform paradigm solves significant application operability and performance use cases by enabling offering owners to get computing power closer to the end user. This is especially important when aiming to provide a thick-client user experience to cloud-hosted applications, or to aggregate and process data from thousands of IoT devices. While EC deployments will reduce latency, they will introduce levels of complexity in application design. Though still early, capable EC offerings may include:

  • EC as a Service: the edge platform is offered via provider-owned and operated assets, requiring little to no edge infrastructure on the part of the customer. This can be IaaS or PaaS. If the latter, it will likely have one or more software platform components with which the application owner must comply.
  • Hardware Abstraction and Vendor-Independent Architectures: enables ‘Write Once / Deploy Broadly’ capabilities for developers to write code to a platform that can be deployed through automation, governed by policy.
  • Cloud Independence: EC intermediary systems can provide an abstraction layer to work with multiple clouds simultaneously or switch between clouds if business needs / opportunities warrant.

Given the diversity of platforms and to-be-defined Edge standards, ISVs and SIs are filling the gap more so than Enterprises .. in short, organizations don’t really want to ‘roll their own’. This will shift as Edge-Native development paradigms become the norm for various application patterns. Open-Source platforms are clearly ahead in Edge development, albeit in a less-cohesive fashion. At the least, Edge-interested companies can cobble together an implementation from an impressive set of tools that will reduce their overall code and deployment challenges.

When defining EC capabilities, certain realities will come into play that will drive vendor selection and application considerations for organizations. In no particular order, these may include:

  • n-layer / n-tier, rather than flat, application architectures will lend themselves well to EC, especially if some of the layers / tiers are already defined as services-oriented implementations.
  • Services-oriented architectures (alluded to, above) enable broad application distribution, whereby services can be located and referenced to avoid duplication of code, capacity and effort.
  • Microservices architectures with policy-based deployment and connectivity models.
  • Configurable Process Orchestration.
  • Dynamic workload placement agility.
  • Aggregation and processing layers placed closer to data collection assets, creating simplified inline real-time feedback to local monitoring assets.
  • Centralized processing tiers aggregating content for historical reporting.
  • Always-available connectivity, including 5G networks to ensure data can flow between tiers.

EC makes it possible to collect, process, augment and deliver data and content to end users. Besides improved user experience, EC enables thousands of real-time use cases that were simply infeasible before. Please see ‘Edge Computing Use Cases’, below, for examples.

Edge Computing Use Cases

EC lends itself to a wide variety of use cases, typically described as stories when engaging prospects. A few are described below for reference.

Mapping and Directions:

  • An end user makes a request from a mapping application to obtain directions to their destination.
  • The request is authenticated by the main site, providing a processing token and redirection to the nearest edge site.
  • The token validates the request between the user device and the edge site and allows access.
  • The edge site processes the request independently, adding additional (and hyper-local) ephemeral assets like maps, weather, specific driving directions and more, sending the response to the mobile device.
  • The edge site performs lazy updates to the main site to update user history while the trip is in progress.
  • The end user receives the directions from the edge site, monitoring their progress locally via their mobile device.
  • While navigating, the mobile application updates the edge site and receives localized updates on traffic, weather or other local travel factors.
  • The user arrives at their destination and the edge site lazy updates user history on the main site.

To summarize: the main site offloaded parts of the services delivery to the edge site based on user location. The edge site performed localized outreach to manage changing conditions, informing the end user. The edge site updated the main site in the background during the trip, finalizing user history at its end.
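The token handoff at the heart of this flow can be sketched as follows. This is a hypothetical illustration (the shared-secret scheme, claim names and helper functions are invented, not a vendor protocol): the main site signs a short-lived token, and the edge site validates it locally with no round trip back to the main site.

```python
# Sketch of the main-site / edge-site token handoff described above. All names
# and the token format are illustrative; a real deployment would likely use a
# standard such as signed JWTs.
import hashlib
import hmac
import json
import time

SHARED_SECRET = b"main-and-edge-shared-secret"   # provisioned out of band

def issue_token(user_id, edge_site, ttl=300):
    """Main site: authenticate the user, then mint a token plus a redirect."""
    claims = {"user": user_id, "edge": edge_site, "exp": time.time() + ttl}
    payload = json.dumps(claims, sort_keys=True).encode()
    sig = hmac.new(SHARED_SECRET, payload, hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig, "redirect": f"https://{edge_site}"}

def validate_token(token, edge_site):
    """Edge site: verify signature and claims locally (no central round trip)."""
    expected = hmac.new(SHARED_SECRET, token["payload"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, token["sig"]):
        return False
    claims = json.loads(token["payload"])
    return claims["edge"] == edge_site and claims["exp"] > time.time()

token = issue_token("user-42", "edge.eu-west.example.com")
print(validate_token(token, "edge.eu-west.example.com"))  # True
```

Because validation needs only the shared secret, every subsequent request in the trip stays between the device and the edge site, which is what makes the lazy, background updates to the main site possible.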

Image Acquisition / Manipulation for Social Media Upload:

  • An end user snaps a picture on their mobile device with the intent to upload it to their favorite social media site.
  • The request is authenticated by the main site, providing a token and redirection to the nearest edge site. The token validates the request between the user device and the edge site and allows access.
  • The mobile device opens the raw image file in the social media application, which is uploaded to the edge site. The main site is notified of the activity in the background, however the raw image is not uploaded to the main site just yet.
  • The user edits the image in the social media application (cropping, filters, enhancements, etc.). The mobile application prepares to send the delta of these changes (not the image file itself) to the edge site.
  • The user confirms their changes and when satisfied, commits them on the mobile device.
  • The edge site dynamically allocates computing capacity to manipulate the image to user specifications.
  • The edge site uses computing capacity to finalize the image, notifying the user when complete.
  • The edge site lazy writes the final version to the central server for permanent storage.

The main site offloaded processor-intensive image manipulation tasks to the edge site. Note the image was only uploaded once, and only to the edge site. Only after the end user approved the changes was the final optimized image uploaded to the main site for permanent storage.
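The delta-edit step above can be sketched as follows. This is a deliberately toy example (the image is a small grayscale grid and the `crop` / `brighten` operations are invented stand-ins for real filters): the device uploads the raw image once, then sends only a compact list of edit operations for the edge site to replay.

```python
# Sketch of the delta-edit flow: the edge site holds the raw upload and
# applies the user's edit deltas, so the full image never travels twice.

def apply_edits(image, edits):
    """Edge site: replay the user's edit deltas against the raw upload."""
    for op in edits:
        if op["type"] == "crop":
            top, left, h, w = op["top"], op["left"], op["height"], op["width"]
            image = [row[left:left + w] for row in image[top:top + h]]
        elif op["type"] == "brighten":
            image = [[min(255, px + op["amount"]) for px in row] for row in image]
    return image

raw_upload = [[10, 20, 30],
              [40, 50, 60],
              [70, 80, 90]]                       # uploaded to the edge once

edit_deltas = [                                   # sent instead of the image
    {"type": "crop", "top": 0, "left": 1, "height": 2, "width": 2},
    {"type": "brighten", "amount": 5},
]

final_image = apply_edits(raw_upload, edit_deltas)
print(final_image)  # the finished image, ready for lazy write to the main site
```

The bandwidth saving is the point: a few dozen bytes of operations replace a second multi-megabyte upload, and the compute-heavy replay happens on edge capacity rather than on the mobile device or the central server.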

Other use cases worth discussing:

  • Location-based Interactive Gaming like Ingress and Pokemon Go.
  • TMS: Vehicle Monitoring and Management through processing telematics. 5G enablement opportunity.
  • Manufacturing: ‘Smarter’ Sensors for Monitoring and Management.
  • “Smart Buildings” Monitoring and Management through embedded systems.

EC use cases number in the thousands. No customer buys ‘Edge Computing’ by itself .. they will have at least one (and likely several) use case goals in mind. It is the seller’s / SME’s task to understand the target use case proposals to the point where we can suggest appropriate solutions from our client offerings. Note that while adopting a platform can satisfy a priority use case, the same platform can be used for additional use cases to achieve more ROI from an end-customer investment.

Edge Computing Providers

As use cases vary far and wide, these are all ‘some assembly required’ .. these companies provide the hardware and / or software platforms, leaving the customer to connect and orchestrate systems and services. It is highly unlikely widely-accepted standards will emerge across all platforms.

  • Akamai – Edge Computing Platform: A mature Edge platform with serverless code execution, deployed on the Akamai global CDN network. “Edge Computing” is the new Akamai marketing tagline. Offers a free trial.
  • Amazon – Lambda@Edge / IoT Greengrass: Lambda@Edge, a feature of Amazon CloudFront, enables serverless code execution closer to the end user; capacity is on-demand, and AWS does not charge the customer when the code is not running. IoT Greengrass addresses IoT-specific use cases for connectivity and management.
  • Cisco – Edge Computing Solutions: Router-reliant with a small footprint.
  • Dell EMC – Project Frontier: Hardware-centric offering. VMware is a partner.
  • Ericsson – Edge Computing: Telecom-centric EC platform.
  • HPE – Edgeline Converged Edge Systems / HPE GreenLake Edge: Hardware-centric offering; includes locally-deployed resources on a PAYG basis. Offers an Edge Computing Resource Library.
  • IBM – Cloud at the Edge: A bit of marketing oatmeal .. repurposing a combination of IBM Cloud Services and Middleware. Despite the marketing goo, IBM can definitely do Edge.
  • Intel – Edge Technology: Includes Intel Market-Ready Solutions, pre-built IoT kits with Edge capabilities.
  • Microsoft – Azure IoT Edge / Azure Stack Edge: Azure IoT Edge is a fully-managed service built on Azure IoT Hub; Azure Stack Edge is a locally-deployed Azure component, extending to the Azure cloud. Article: What is Edge Computing.
  • OpenStack – OpenStack Edge Computing: Open-source edge platform work governed by the OpenStack community.
  • StackPath – Build Your Edge: Secure Edge Platform for Developers, enabling the deployment of Edge-Distributed applications.

Edge Computing Audiences

Most IT audiences will recognize Edge by name .. but all will have different definitions and a wide variety of understanding and expectations for Edge use in their organization. Ditto for Executive audiences. The majority of these audiences will need to be engaged through education on the performance and cost-savings aspects of EC, tailored to their role in an organization. IT can be engaged through potential cost savings where an offering can be decentralized, shifting processing to remote locations to reduce central compute and bandwidth.

Surprisingly, Product Owners (POs), who are typically hungry for new features to improve their products, are not yet seen as primary audiences for EC .. the PO audience must also be educated, relying on ‘what if’ scenarios and stories that will improve their offering. Given virtually infinite use cases, engaging POs through discovery is an important part of an engagement. A seller can approach prospective POs by describing the value of EC in adding new features and enhancing performance for end-customer audiences. This will require sellers to take a close look at the company offering, understanding the use cases the offering enables and extrapolating offering enhancements they can present to the PO. Not all ideas will land with POs .. this engagement will require thick-skinned ‘idea hamsters‘ working with outreach agents to ensure a credible impression in an initial contact.

EC is not a technical sale at the outset. It is creative and educational, relying on sellers identifying, researching and offering use cases that align to prospect roles within the target organization.

A company should consider EC for:

  • Adding new features to their offering that can be optimized via the end-user location.
  • Enhancing the performance of their offering.
  • Cost savings through application optimization (smaller centralized footprint, reduced bandwidth, etc.).
  • Improving end-customer experience with new features and better performance.
  • Localizing customer experiences.

A seller needs the ability to recognize, expand and document end-customer use cases / need states that enable them to secure a solid prospect.

An EC sale will cross multiple audiences, engaging PO, IT, Operations, Developers and Executive audiences.

Conclusion

Edge Computing is a storytelling campaign. It may not begin with a prospect who has an enhancement story already in mind. As noted above, no company will buy ‘edge’ .. they will consider new features, better customer experience, faster performance, application stability, cost savings and so on. Edge Computing is a paradigm shift .. engagements will involve several technical and business audiences, as well as reaching from developer audiences to the executive suite of an organization. Product Owners and Developers will present as useful influencers, but the ultimate decisions must be recognized across the organization.

5W – Customer Data Platform (CDP)

Thank you for reading! Please see “Why 5W?” for context, methodology and disclaimers.

Technology Overview

Customer Data Platforms are recent aggregation / expansion capabilities realized through the integration of customer interaction support software, enabling a company to compile a 360-degree view of customers and prospects. Armed with this information, Marketing and Sales can make relevant and personal connections to prospects to increase conversion rates.

Companies will recognize the usual suspects of potential component systems that can contribute to a CDP effort:

  • Social Media Engagements
  • Marketing Automation Platform (MAP)
  • E-Commerce Systems
  • Enterprise Resource Planning Systems (ERP)
  • Customer Relationship Management Systems (CRM)
  • Customer Support Systems
  • Customer Experience Management (CXM)
  • Warehouse Management Systems (WMS)
  • Order Management Systems (OMS)
  • Fulfillment Management Systems (FMS)

CDPs provide a configurable means to connect, aggregate and combine data across disparate systems, guided by business-facing definitions from Sales and Marketing. Once deployed, a CDP should require minimal IT intervention, provided the CDP gives business users the capability to combine and view real-time data to test and confirm hypotheses on their own.

Note that CDP integrates with on- and off-premises systems and cloud providers. CDP is an emerging technology use case and will require education and nurture efforts to engage end customers.

Business Benefits

Companies store customer data and interactions in an ever-growing variety of disparate systems, most of which are only marginally connected to each other, or not connected at all.

As an example, Marketing and Sales recognize the value of having the latest customer transactions, preferences and communications data available to them, along with current inventory levels, order and delivery statuses, account balances, credit status and so on. While these data can be combined through manual system search and interaction, a manual effort is time-consuming and introduces risks, not only to security (through multiple-system exposure to out-of-department audiences) but also of contextual data inaccuracies (relating the wrong data to transactional events, failing to keep the data up to date, and so on). CDP helps to manage this by making connections through configuration .. not coding .. automating data retrieval, assembly and presentation of the most current data to the business user.

Determining which data to combine requires visibility into the business processes and customer / prospect interaction goals of the target audiences .. the people who can benefit the most from aggregated business data. Note that all companies have these interaction goals, but they may not necessarily be stated, reviewed, aligned or published. Further, helping non-technical audiences understand the business benefits of CDP in their environment will require education, mostly through use-case based engagements, where offering sellers present use cases demonstrating how a CDP can enable business benefits in their environment.

In short, a CDP combines these data, enabling thousands of business use cases to improve customer service, accelerate conversions, document performance and more. Some examples are presented in ‘Use Cases’, below.

CDP Capabilities

To play in the CDP space, a vendor offering must:

  • Provide configurable connectivity to organizational systems, whether cloud, on-premises, database, LOB, etc.
  • Collect and transform data from connected systems, storing as necessary (not all systems will allow real-time access, nor may it meet necessary performance requirements to do so).
  • Create / Associate Data Consumer identifiers across these systems to surface customers in context.
  • Define:
    • Consumer profiles and groups, assigning appropriate permissions and workflows.
    • Internal audiences, role-based access control and the depth of data to view when published.
  • Segment Consumers into meaningful ‘buckets’ for workflow activities and content distribution.
  • Provide the capability to switch on / off connections across systems as business needs / opportunities emerge .. without IT assistance.
  • Manage and expand Consumer / Audience profiles over time to improve views and visualizations.
  • Ensure connection and end-to-end security / privacy / auditing of acquired / stored data.
  • Publish Customer and Audience Data across tools for consumption by appropriate audiences within an organization.

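The ‘Segment Consumers into buckets’ capability above can be sketched as configuration rather than code. In this hypothetical example (all field names and segment rules are invented), segments are declared as named rules over profile fields, echoing the CDP goal of business-user-driven setup without IT intervention:

```python
# Sketch of rule-based consumer segmentation. Segments are declared as
# configuration (name + predicate), so adding a bucket means adding a rule,
# not changing application code. All fields and thresholds are illustrative.

consumers = [
    {"id": 1, "lifetime_value": 1200, "last_order_days": 12,  "opted_in": True},
    {"id": 2, "lifetime_value": 90,   "last_order_days": 200, "opted_in": True},
    {"id": 3, "lifetime_value": 450,  "last_order_days": 45,  "opted_in": False},
]

# The 'configuration': each segment is a name plus a rule over profile fields.
segments = {
    "vip":      lambda c: c["lifetime_value"] >= 1000,
    "win_back": lambda c: c["last_order_days"] > 90,
    "mailable": lambda c: c["opted_in"],
}

def segment_consumers(consumers, segments):
    """Assign each consumer to every segment whose rule it satisfies."""
    buckets = {name: [] for name in segments}
    for consumer in consumers:
        for name, rule in segments.items():
            if rule(consumer):
                buckets[name].append(consumer["id"])
    return buckets

buckets = segment_consumers(consumers, segments)
print(buckets)  # {'vip': [1], 'win_back': [2], 'mailable': [1, 2]}
```

Note consumers can land in multiple buckets at once, which is what makes the segments useful for overlapping workflow activities and content distribution.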
Through these capabilities, a single source of truth will emerge, along with enabling a 360-degree view of a specific customer or group. This increases confidence within the consuming audiences and empowers them to make better, data-driven decisions when working with customers.

Some examples are presented below.

Use Cases

While each company will have specific use cases to fulfill, many aspects will fall into semi-standardized buckets. The real impact of CDP is in connecting across systems, enabling users outside an application’s primary audience to see its data in a controlled and secure manner.

Some samples to get creative juices flowing .. the sub-bullets in the first example are the discrete steps and systems with which a user would engage manually to create the outcome:

  • Customer Support – Customer Complaint: Address a product complaint and create a CRM Task, assigning to a Customer Service Agent for resolution. The source of the complaint could be a call, a tweet, an email, an online form, etc. The CDP can perform the following discrete steps:
    • Look up the customer ID in the customer database (note this could be in the Customer Support CRM or in a separate Customer CRM). This may require the Twitter handle, email address or customer name to identify the customer.
    • Retrieve the customer invoices from the Invoicing System using the Customer ID.
    • Identify the proper invoice based on the product in question.
    • Confirm the product was ordered in the Order Management System (OMS).
    • Confirm the product was shipped in the Fulfillment Management System (FMS).
    • Identify if the product is still under warranty and can be replaced.
    • If the product can be replaced, check the Warehouse Management System (WMS) to see if the item is in stock and can be shipped.
    • Create a task containing the data above in the Customer Support CRM and assign to a Customer Support agent.

Executing these steps enables a Customer Support agent to reach out to the customer with a specific resolution, addressing the issue directly and providing a solution in the first contact. This level of personalized interaction ensures customer satisfaction.

A CDP provides connectivity and automation, designed and configured by a business analyst (not IT) for each of these tasks. This enables customer- and agent-friendly resolutions to customer support issues. Further, the data collected across all these systems provides massive Business Intelligence opportunities for an organization.
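The discrete steps above can be sketched as a single chained lookup. In this hypothetical example, each back-end system is mocked as a dictionary; a real CDP would make these lookups through configured connectors, and every system and field name here is illustrative:

```python
# Sketch of the complaint-resolution chain, with each line-of-business system
# mocked as a lookup table. The chain mirrors the discrete steps a support
# agent would otherwise perform by hand across seven systems.

CRM      = {"@janedoe": {"customer_id": "C-100", "name": "Jane Doe"}}
INVOICES = {"C-100": [{"invoice": "I-7", "product": "toaster"}]}
OMS      = {("C-100", "toaster"): {"ordered": True}}
FMS      = {("C-100", "toaster"): {"shipped": True}}
WARRANTY = {("C-100", "toaster"): {"under_warranty": True}}
WMS      = {"toaster": {"in_stock": True}}

def build_support_task(handle, product):
    """Chain the discrete lookups into one task for a support agent."""
    customer = CRM[handle]                                  # identify customer
    cid = customer["customer_id"]
    invoice = next(i for i in INVOICES[cid]
                   if i["product"] == product)              # find the invoice
    return {
        "assignee": "support-agent",
        "customer": customer["name"],
        "invoice": invoice["invoice"],
        "ordered": OMS[(cid, product)]["ordered"],          # confirmed in OMS
        "shipped": FMS[(cid, product)]["shipped"],          # confirmed in FMS
        "replaceable": (WARRANTY[(cid, product)]["under_warranty"]
                        and WMS[product]["in_stock"]),      # warranty + stock
    }

task = build_support_task("@janedoe", "toaster")
print(task["replaceable"])  # the agent can offer a replacement immediately
```

The point of the sketch: the agent receives one assembled task instead of seven system logins, and the same chained data feeds the Business Intelligence opportunities noted above.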

The remainder of these use cases are summarized without the intermediate steps. A CDP offering should be sold by presenting use cases that resonate with a target customer, likely leading to a Discovery Call.

  • Marketing – Improve Marketing Orchestration: Capture feedback in real-time across all channels (MAP, Sales, E-Commerce) by collecting data aligning to MAP outreach to determine marketing effectiveness.
  • Marketing – Improve Marketing Communications: Capture real-time feedback from marketing campaigns as potential customers are engaging with marketing content. Marketing can gauge which messages are more effective.
  • Marketing – Perform / Streamline A/B Tests: Drive marketing moments / capture the results of product enhancements in real time by initiating multiple (two or more) actions simultaneously, capturing the results and feedback in real time to determine which combination of messaging and the channel is more effective.
  • IT Governance – Improve Customer Data Quality: One source of truth .. the process of connecting and correlating disparate data systems enables CDOs with a solid foundation of tracking internal audiences with access, and controlling the data these audiences can see.
  • Operations – Improve the Customer Journey: Integrate with CXM and MarTech systems to isolate and clarify how a customer got into the stages .. how quickly and how effectively .. all the way through conversion and fulfillment.
  • Sales – Current State of the Customer: Present an all-up customer view for a seller on their way into a customer meeting. Captures content across systems: confirming the latest orders have been shipped, invoices have been paid, customer support issues have been resolved, etc., all prior to the seller arriving in a face-to-face meeting with the customer.

To assemble the data in all of these (and other) use cases, internal systems must be accessed by out-of-audience entities, which, if performed manually, is time-consuming, creates security risks and could yield outdated / erroneous data. A CDP standardizes these interactions and provides a unified, per-audience view of these data.

Hidden Gem: Besides enabling Operational and Business Intelligence opportunities, a robust CDP implementation creates a framework for Data Governance use cases.

CDP Providers

A great many companies claim CDP capabilities (CDP Vendor Segment cites ~80), but most are add-ons to their existing LOB systems and not ‘pure’ CDP plays. These include the bulk of the larger ERP companies, as well as more than a few Integrated Platform as a Service (iPaaS) / Hybrid Integration Provider (HIP) vendors.

Companies making a CDP claim run the gamut of functionality from configurable connectivity, accessing in-flow data streams, reporting and / or creating persistent, referenceable repositories to ensure interactivity and access to aggregated data. A partial list follows, where only ‘pure’ CDP plays are included:

  • ActionIQ: “An Enterprise-Grade Solution With the Agility of a Startup”. Integrates with MAP and Analytics tools.
  • BlueVenn: Presents as a unified customer data management, compliance and customer journey orchestration platform, including analytics and machine learning to drive personalized engagements.
  • Exponea: Acquired by Bloomreach. Cites “Sell To Customers, Not Sessions”, which references session-based engagements with E-Commerce sites. Solid E-Commerce retail focus with integrated MAP capabilities and a number of MAP- and data-centric integrations. Offers an Exponea Overview video (three minutes).
  • Lytics: Focuses on using CDP to personalize experiences for every step in the Customer Journey. Includes several customer-centric use cases, including acquisition, engagement, upsell, win-back and renewal. Integrations for the retail, CRM and MAP spaces. Looks more like enlightened MAP.
  • Maropost: Extends MAP with CDP to amplify email marketing and E-Commerce personalization capabilities. Looks more like enlightened E-Commerce.
  • mParticle: Focuses on CXM and improving the Customer Journey. Rich integration framework and SDK to support beyond MAP and CRM. Offers a 60-day services engagement to get started and a gated demo.
  • Segment: Claims specialization in several verticals and platforms (Retail, Mobile, B2B, Marketplace and Media). Largely DIY. Offers a no-cost Developer tier.
  • Tealium: Manages connections, standardizations, transformations, integrations and activation (delivery to devices). Claims 1,300 client- and server-side integrations offered in the Tealium Integrations Marketplace.

Frequent updates to this section as new vendors emerge.

ED: As Article Publish dates are frozen in time, it is quite possible reviewed vendors and their capabilities may have advanced beyond those presented herein. Please accept my apologies for my shortcomings. A note to vendors (old, evolved and new): please reach out with current offering capabilities and I will update the list.

CDP Audiences

A few enlightened prospects will recognize CDP by name; slightly fewer will recognize it by the integration opportunity CDP provides. As usual, the majority will need to be engaged through education, tailored to their role in the organization and to the business benefits of exploring CDP.

Most customers will not recognize an immediate need for a CDP, likely citing that their CRM or Customer Support systems already present these data. To a point, they’re accurate: these systems provide the present customer / prospect state to the primary audience of the system .. but not to the secondary audiences who could benefit the most from knowing all the data about their customer.

CDP connects a wide variety of business platforms, creating virtually infinite use cases, so discovery is an important part of an engagement. A seller can approach a prospective CDP client by describing the value of CDP integration to create Business Intelligence use cases, enabling the seller to map a story to prospect systems to demonstrate business value, including:

  • Value of configurable system integration and the virtuous cycle enabled thusly.
  • Data Governance (hidden gem .. solves two business challenges with one implementation) benefits, where CDP exposes access and management control opportunities across systems.
  • Dashboards and Beyond, delivering relevant, interactive and secure information to the proper people at all levels in an organization.
  • Agility and Flexibility, enabling Citizen Analysts / Marketers to review data that is most relevant to them.
  • Speed to Business Value: data-driven business benefits can be realized with the first connections between systems.
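
To make the integration value concrete, the unification a CDP performs can be sketched in a few lines. This is a hypothetical illustration .. the systems, field names and merge-on-email logic are simplifying assumptions for demonstration, not any vendor’s API:

```python
# Hypothetical sketch: unify customer records from two source systems
# (CRM, MAP) into a single profile per customer, keyed on email.
# All field names and records are illustrative assumptions.

from collections import defaultdict

crm_records = [
    {"email": "ana@example.com", "name": "Ana Ruiz", "account": "Acme"},
]
map_records = [
    {"email": "ana@example.com", "last_campaign": "Spring Promo", "opens": 4},
]

def unify(*sources):
    profiles = defaultdict(dict)
    for source in sources:
        for record in source:
            key = record["email"].lower()   # naive identity resolution on email
            profiles[key].update(record)    # later sources enrich earlier ones
    return dict(profiles)

profiles = unify(crm_records, map_records)
# profiles["ana@example.com"] now combines CRM and MAP attributes
```

Real CDPs resolve identity across far messier keys (device IDs, cookies, phone numbers), but the aggregate-and-enrich loop above is the core idea a seller is describing.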

Important note: CDP is not ‘rip and replace’ .. CDP integrates with incumbent systems, databases and file-based repositories.

CDP is not a technical sale .. it is creative and educational, relying on sellers identifying, researching and offering use cases that align to prospect roles within the target organization.

As the CDP space is early stage at the time of this writing, opportunities within are mostly greenfield .. but will require discovery, education and nurture to make an effective case to decision makers in a target organization. CDP prospects will already be combining some of these data, albeit manually. It is important to demonstrate the value to the specific audience in time savings and improving the quality of data presented to users.

Note there will be a fair bit to unpack as customer environments will contain incumbent systems, most of which store customer data. On this, please recognize that access to any systems containing customer data must be managed appropriately, as GDPR and CCPA constraints may be in play. A seller must inventory the present state to realize the scope of the existing environment, recognizing incumbent systems and integration opportunities therein .. all while pointing out how CDP will manage any potential privacy breaches.

A company should consider a CDP for:

  • Consolidating / presenting an all-up view of a customer in a single place, securely, and to relevant audiences.
  • Capturing a more granular understanding of customers.
  • Consolidating / switching analytics or marketing systems / vendors.
  • Personalizing customer experiences.
  • Unifying messaging across all channels.

A seller needs the ability to recognize, expand and document end-customer use cases / need states that enable them to secure a solid prospect.

Primary CDP audiences include:

  • Marketing
  • Product Owners
  • Customer Support Organizations
  • Sales (capturing an all-up customer view)
  • Analytics

Expect these audiences to expand through digital-driven CDP engagements, which will surface hand-raisers and other interested entities within end-customer organizations.

Primary Targets for CDP implementations are not necessarily aligned to specific industries. Targets include companies who:

  • Have connections between ‘some’ systems across ‘some’ non-primary audiences. This will manifest itself with companies that have connected MAP and CRM systems to power active outreach. Note that Sales and Support are left out, presenting the opportunity for us to advance use cases when other systems can be connected. Treat this as gap analysis, where other systems can be brought in to improve customer context and relevance.
  • Can recognize the business value to be gained by connecting disparate systems. This is use-case specific, and should be workshopped with prospects to demonstrate value.
  • Are expanding from tribally-supported Sales / Marketing use case feedback loops to their audiences, but don’t expose content to audiences beyond the bits a data consumer already ‘knows’ about their customer base. It’s important to advance the concept of exposing these data to identify gaps in the customer knowledge base.
  • Seek to formalize an approach to exposing end-customer data across multiple, incumbent systems, mapped to specific internal audiences. This represents a longer sales cycle, but enables a seller to demonstrate expertise in the space.

Again, CDP is in the educational phase. CDP is a combination of existing systems and use cases .. many of which companies may be doing manually on their own. CDP provides the connectivity, automation, data aggregation and presentation layer to get the right data to the right audiences at the right time.

Conclusion

CDP provides an interesting and accessible means to integrate across multiple systems, securely exposing integrated data within a single interface through low-code configuration .. versus custom code, multiple logins and reporting capabilities that require manual manipulation to extract business value.

5W – Understanding Quote-to-Cash (QTC)

Thank you for reading! Please see “Why 5W?”  for context, methodology and disclaimers.

Quote to Cash Overview

The Quote-to-Cash process (also known as Q2C or QTC, the latter being the acronym I use in this article) is a set of business processes that includes product / offering selection, quoting, pricing, up-sell, contracting, invoicing, integration with payment systems, contract renewal and more. We should all recognize that reducing friction after the sale is a critical component of engaging delivery, achieving revenue targets and getting paid. In an ideal world, QTC would be automated and heavily integrated with all component and dependent systems.

Given the importance of streamlining revenue-associated business processes, it is surprising that many companies have not effectively automated or standardized their QTC processes. Some businesses have standardized a few of them .. others have selected or inherited multiple SaaS-based systems over time, and still others rely on separate manual processes. Whatever the mix, most businesses lack significant end-to-end integration for QTC.

You might also have heard of Order to Cash (OTC). While the terms may be used interchangeably, OTC and QTC are different:

  • OTC likely does not include configuration capabilities for the product / offering, like setting the price, creating a quote, or any bits associated with Contracts and Contract Renewal. OTC tends to be at the order level, where these items are simple (quantity, color, etc.), or already identified.
  • QTC encompasses a much larger set of business processes and as noted above, may well include OTC bits for scoping, pricing, logistics and delivery.

An effective QTC implementation will be tightly bound with company Enterprise Resource Planning (ERP), Configure Price Quote (CPQ), inventory and Point of Sale (POS) Systems for brick-and-mortar establishments. If the organization transacts business online, an E-Commerce System acts as POS, and a Customer Experience Management (CXM) System can provide significant Business Intelligence (BI) value by tracking customer interactions on the site, capturing buying patterns and interest in other products for future cross-sell / up-sell outreach.

If the organization lacks native (or effective) integration across systems, it can consider deploying Robotic Process Automation (RPA) as an effective stopgap to connect other systems to a QTC system while integrations are built.

 
QTC Business Benefits

QTC falls on the right-hand side of a Supply Chain Management (SCM) diagram, where an end customer has selected / described a finished product / offering and is entering purchase / contract / services negotiations. Note that in a retail E-Commerce transaction, there is typically little complexity to this: select the item, customize, click the quantity, enter payment and shipping information and the product shows up. For more complex offerings, scoping, customization, integration with external systems, current-customer discounts and much more will require far more detail when finalizing a purchase.

Consider QTC enabling / automating a complex set of transactional steps, including:

Action | Actor | Notes / Opportunities
Product / Offering Selection | Buyer | Guided by Seller / Automation
Configuration / Scoping | Buyer / Seller | Guided by Seller / Automation
Configure Product / Offering / Services Price Quote (CPQ) | Seller | Seller Document Workflow
Approve / Reject CPQ | Seller Management | Iterative; should be automated
Create Contract | Seller | Seller Document Workflow
Provide Approved Quote and Contract to Customer | Seller | Seller Document Workflow
Negotiate Contract | Buyer / Seller | Iterative; requires multiple entities
Sign Contract | Buyer / Seller | Logistics: ‘wet’ versus digital
Check Customer Credit | Seller | External System
Place Order | Seller | Seller Internal System
Create Invoice | Seller | Seller Internal System
Send Invoice | Seller | Seller Internal System
Pay Invoice | Buyer | Buyer Internal System
Revenue Recognition | Seller | Seller Internal System
Contract Renewal | Buyer / Seller | Typically outside the QTC scope

This list is not complete and will vary widely across companies.
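
A condensed subset of these steps can be modeled as an ordered workflow, where automation advances a deal until it hits the first incomplete step. This is an illustrative sketch only .. the step names and halt-on-first-gap logic are simplifying assumptions, not a description of any QTC product:

```python
# Hypothetical sketch: QTC as an ordered series of (step, actor) pairs.
# An automated run advances until the first step that is not complete.
# Step list is a condensed subset of the table above, for illustration.

QTC_STEPS = [
    ("Product / Offering Selection", "Buyer"),
    ("Configuration / Scoping", "Buyer / Seller"),
    ("Create CPQ", "Seller"),
    ("Approve CPQ", "Seller Management"),
    ("Create Contract", "Seller"),
    ("Sign Contract", "Buyer / Seller"),
    ("Check Customer Credit", "Seller"),
    ("Place Order", "Seller"),
    ("Create Invoice", "Seller"),
    ("Send Invoice", "Seller"),
    ("Pay Invoice", "Buyer"),
    ("Revenue Recognition", "Seller"),
]

def run_qtc(is_complete):
    """Advance through steps in order; return (finished, pending) lists."""
    for i, (step, actor) in enumerate(QTC_STEPS):
        if not is_complete(step):
            return QTC_STEPS[:i], QTC_STEPS[i:]
    return QTC_STEPS, []

# Example: everything through 'Sign Contract' is done
done = {"Product / Offering Selection", "Configuration / Scoping",
        "Create CPQ", "Approve CPQ", "Create Contract", "Sign Contract"}
finished, pending = run_qtc(lambda step: step in done)
# pending[0] is ("Check Customer Credit", "Seller")
```

The value of a real QTC system lies in attaching policy, approval routing and notifications to each transition .. the linear scan above only captures the sequencing.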

QTC Capabilities

Many E-Commerce transactions automate the early stages of this (Product / Offering / Configuration / Scoping, some quoting), but as the bulk of these processes involve disparate systems (including internal systems like Contracting, Financial and Warehouse Management), E-Commerce can (and should) only go so far in the overall QTC scope.

QTC transactional steps touch multiple systems in the Seller environment:

  • Order Management
  • Configuration Management
  • Quoting (CPQ) Management
  • Contract Lifecycle Management (CLM)
  • Credit Management
  • Warehouse Management (WM)
  • Accounts Receivable (AR)
  • Payment Management.

.. and so on. Automation is the first step, but integration is the real key to accelerating the entire scope of QTC to business benefit.

The QTC process comprises discrete steps, a subset of which are described above. A QTC system that integrates, automates and governs transactions .. including policy-based logic, approval routing, notifications and an all-up process view .. is a major benefit to an end customer. A quote from the Aberdeen Group (provided by BlackCurve) cites significant QTC ROI, resulting in:

  • 105% larger deal sizes
  • 49% higher proposal volume
  • 28% shorter sales cycles
  • 26% more sales representatives meeting their targets
  • 17% higher conversion rates

These benefits are largely due to streamlined, interconnected and error-free automated processes. Please note the numbers above were extracted from QTC systems across multiple sales organizations and across all industries. End customers should take a serious look at QTC to achieve this kind of reporting alone .. most organizations can search across systems to validate these findings in their present environment.

Addressing the automation and interconnectivity use cases in the ‘Business Challenges’ section above will aid in the selection of a QTC offering, as each represents a checklist item to be confirmed in an end-customer environment during discovery.

Organizations with robust Supply Chain Management systems will recognize the opportunity to improve in-chain payment processes by expanding the scope of the QTC process to include Trading Partners for whom they are a Provider or a Seller.

QTC Use Cases

QTC is effectively its own use case, speeding the process between initiating, configuring, pricing and closing on a sale to receive revenue. Granted, you’ll note a lot of complexity across a large number of systems that describe a complete solution.

QTC Providers

The major ERP, CRM and SCM players have a QTC offering or integration / implementation:

Company | Features / Notes
AcceLIM | Primarily visibility and integration tools.
Acumatica | ERP with CPQ and QTC integration.
Apttus | CPQ, contract management, E-Commerce and revenue management. AI- / ML-enabled.
Callidus Software | Cloud-based sales, marketing and customer experience. SaaS-based, with flexible (subscription, services and licensing) pricing options. Owned by SAP.
Capgemini | CPQ and Contract Lifecycle Management (CLM) with QTC integration.
Cincom Systems | CPQ with QTC integration.
CloudSense | CPQ, OM, CLM and E-Commerce built atop Salesforce.
ConnectWise | CPQ, RPA and QTC integration.
DataServ | Primarily FPA with QTC integration.
DocuSign | The DocuSign Agreement Cloud is a significant expansion to their online document management capabilities, which now include CLM.
Experlogix | CPQ and OM for Salesforce and Microsoft Dynamics CRM.
FPX | Experience Management solutions for Enterprises, providing CPQ for Sellers and Buyers.
KBMax | CPQ with Salesforce integration (Epicor).
Oracle | The Oracle CPQ Cloud is tightly integrated with its CXM suite.
Salesforce | CRM provider. Acquired SteelBrick in 2015; SteelBrick built QTC as an SFDC application.

ED: As Article Publish dates are frozen in time, it is quite possible that reviewed vendors and their capabilities may have advanced beyond those presented herein. Please accept my apologies for my shortcomings. A note to vendors: please reach out to update your current offering capabilities and I will update.

QTC Audiences

A QTC engagement is unlikely to be greenfield, as some automation, on-premises or SaaS silos will likely exist. It will also cross multiple audiences within an organization, including:

  • IT
  • Contracting
  • Quoting
  • Finance (AR / AP)
  • E-Commerce Platform Owners
  • Logistics and Fulfillment Roles

.. and all the way to the executive suite.

A modern QTC offering should have no issues connecting with modern SaaS silos via API or RPA. This is a solid benefit to call out as end customers will be rip-and-replace averse when engaging with critical legal and financial systems. Initial engagement will include:

  • Discovery of the current order, fulfillment and payment systems, plus business workflows.
  • Capture of integration points (by name and integration method).
  • Documentation of the present workflow as part of solution planning.

Some reluctance will surface as many will cite some of their processing components are working well enough to suit their needs. This posture will manifest itself early in a conversation, especially among operators of ‘in the chain’ components. While a company-wide QTC effort will be driven from the top, there is still value in engaging with operational entities within an organization to ascertain process, performance and contract intelligence.

Common Customer Objections include:

  • QTC is expensive: Cost is relative, and like any investment in business technology, a customer must examine the ROI of the purchase against the potential value the technology and disciplined business processes will bring to the company. There are many SaaS-based QTC offerings with robust integration points offering process improvement without a rip-and-replace requirement.
  • QTC is complicated: The short version: it is. QTC requires discipline in both process management and integration, suggesting a dedicated resource or a partner to assess and execute. QTC is connecting and configuring, NOT coding, in over 75% of use cases, and creates huge time savings when deployed. It is not all sunshine and roses .. if systems that cannot be integrated (siloed, lacking APIs, or paper-based) are in the process, you may not be able to eliminate some manual steps in the first iteration.
  • QTC is overkill: Many organizations may feel they’re too small or not complex enough for QTC. Tease this out in Discovery .. if the customer has a quoting and approval process, they will benefit from QTC capabilities.
  • We lack the bandwidth to deploy QTC: See ‘complicated’, above. There are companies who can assist with this.
  • We already have QTC: I appreciate it when an organization has an existing implementation, as it results in a shorter education cycle and lets us engage more directly. Through Discovery, unpack the extent of their implementation: whether any components are aging / need updating, or don’t play well (integration or reporting) with others. You may also be able to expose opportunities for QTC within their SCM implementation, as noted in ‘capabilities’, above.

Not an objection, but a QTC implementation isn’t typically do-it-yourself. Most organizations will recognize this, so it is important to have engaged Partner Sellers to provide Services in these engagements.

Conclusion

You may surmise QTC is a less common implementation in companies and not likely a greenfield solution. Companies will have manual or semi-automated processes, each with a process discipline. QTC is an enabling technology within E-Commerce, Logistics and Fulfillment platforms, so industry / vertical is less important than how the customer transacts business.

Some offerings are more end-to-end than others .. but note: as flexible Integration is a requirement for a QTC offering, an end customer can select and right-size a solution that has just the features they seek at a price point they can afford. Most of these are SaaS-based, allowing for a low barrier to customer entry and flexible entry / escape paths.

5W – Data Governance

Thank you for reading! Please see “Why 5W?” for context, methodology and disclaimers.

Overview

Data Governance (DG) quickly makes its way into the ‘bad word pile’ at most organizations, especially as they gain visibility into the responsibilities that fall to them when managing user data. Note that more and more RFPs ask DG questions as part of compliance vetting when considering vendors.

The scope of DG is massive, encompassing Personally-Identifiable Information (PII), Personal Credit Information (PCI), Personal Health Information (PHI), inferred combinations of the above, PLUS business-sensitive data for both the company and their customers. ‘At Work’ data is still considered PII by regulatory organizations (opinion by the N3 legal team when referencing GDPR requirements for the EU), so must also be considered for DG policies and practices.

Last, data is no longer tucked safely behind the firewall of a company data center. Companies must take an integrated / hybrid approach to discover, catalog and manage data from a wide variety of sources.

A wrinkle for companies who manage client data: A company must maintain a custodial posture, thus closing the loop to acquisition, enrichment and return of client-owned data. DG is critical to these hand-offs, and must be managed effectively through policies, audited practices and documented enforcement should breaches occur. Part of this wrinkle is ‘data ownership’: where an organization acts as a custodian of client data for a duration, making the organization beholden to provide client notification of breach and assurances of compliant handling of these data.

What is Data Governance?

As it turns out, there is an institute for DG, cleverly named The Data Governance Institute. They’ve been kind enough to define DG, thusly:

Here’s a short definition of Data Governance:

“Data Governance is the exercise of decision-making and authority for data-related matters.”

Here’s a bit longer definition:

“Data Governance is a system of decision rights and accountabilities for information-related processes, executed according to agreed-upon models which describe who can take what actions with what information, and when, under what circumstances, using what methods.”

DG encompasses more than the data itself; it also (in context), refers to:

  • Organizational bodies
  • Rules (policies, standards, guidelines, business rules)
  • Decision rights (how we “decide how to decide”)
  • Accountabilities (and auditing)
  • Enforcement methods for people and information systems as they perform information-related processes.

Note that an IT department (as an identified organizational body) recognizes DG as a necessity, but tends to view DG as a ‘lose-lose’ proposition. The first ‘lose’: no control, no oversight, no consequences .. IT just hopes for the best. The other ‘lose’: opening the hood and examining how data is currently protected, how transit is documented, how encryption is enforced, how access is audited, and whether policies comply with the regulations affecting these data. In the latter case, IT has to do something about it, as knowledge equals disclosure (and management of same).

As a result, IT should not be in charge of DG policies, but should rather be governed by corporate-defined, enforced policies, practices and documentation for the handling of these data. IT needs to be involved: they can provide significant insights into data repositories, current practices and tribal knowledge of data acquisition history.

Data Governance Benefits

The news isn’t all bad .. there are two primary benefits of a solid DG program for a company:

  • Compliance
  • Business Intelligence / Insights

Let’s first recognize Compliance as a ‘need to do’ to safeguard a company against regulatory agency action for lacking established practices that demonstrate intent to comply (legal citation needed .. but this is not an easy bullet to dodge). First line of defense: documented intent to comply with the appropriate agencies. Second line of defense: documentation describing DG practices for compliant handling of data. Third: implementing and enforcing same.

With that said, it is always better to demonstrate corporate practices that ensure employees comply with regulations than to take a ‘head-in-the-sand’ approach to DG. Think of this as an ‘80% Approach’, where a company has a defensible position covering (roughly) 80% of the potential regulatory impact. Note: much of this is employee training and employee acceptance of policy (which can be delivered and captured via a few mouse clicks), but ultimately, violations will impact the corporation based on procedure enforcement, not the errant employee. One last bit: corporate policy regarding data handling must be documented and its enforcement demonstrated.

Data Governance for Compliance

Discover, classify and manage information in ways that meet the obligations enforced by both regulatory and corporate mandates. Some use cases include:

  • Regulatory Agencies
  • Privacy & Protection
  • Records & Retention
  • e-Discovery
  • Audit Readiness
  • Archiving
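
The ‘discover and classify’ portion of these use cases can be sketched minimally. The patterns and tags below are illustrative assumptions .. real DG tooling covers far more data types and detection methods:

```python
# Hypothetical sketch of the 'discover and classify' step: scan free-text
# records for common PII patterns (email, US SSN) and tag each record.
# Patterns and category names are illustrative, not a DG product's rules.

import re

PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text):
    """Return the set of PII categories detected in a text blob."""
    return {tag for tag, pattern in PII_PATTERNS.items() if pattern.search(text)}

classify("Contact ana@example.com, SSN 123-45-6789")  # {'email', 'ssn'}
classify("Quarterly revenue was up 4%")               # set()
```

Once records carry classification tags, retention, e-Discovery and archiving policies can be driven from the tags rather than from ad-hoc knowledge of each repository.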

The BI side of the equation could provide the business and revenue value that funds the compliance costs .. in a perfect world, of course, so let’s see where companies can benefit through surfacing their data BI value as Insights.

Data Governance for Insights

Provide safe access to trusted, high-quality, fit-for-purpose data while enabling effective collaboration among team members through:

  • Data discovery and cataloging
  • Self-service access to data and analytics for business users
  • Managed trusted repositories

Think in terms of accessing CRM, ERP and Supply Chain systems programmatically, enabling Business Analysts to surface a 360-degree view of a customer interaction within a company.

Steps to Data Governance

Courtesy of Tealium and their Universal Data Layer product, here are five discrete steps to securing DG; each of these steps has a business and technical requirement:

Due Diligence: Audit data flows to know where data resides and who has access to it:

  • Business Audience:
    • Identify vendors in use
    • Validate vendor access
    • Review current contracts
  • Technical Audience:
    • Audit vendor technology
    • Review vendor policies
    • Remove non-compliant or unused vendors

Perform a Data Inventory: Understand data types, how data is processed and the associated requirements:

  • Business Audience:
    • Agree on data sensitivity both from a legal and experience perspective (taxonomy)
    • Agree on the data needed to run marketing vs. operations
    • Document data requirements for running the business
  • Technical Audience:
    • Audit vendor technology
    • Review vendor policies
    • Remove non-compliant or unused vendors

Build Controls: Develop procedures to provide clear and accurate notice of data usage both internally, with policy and process, and externally, through notification, terms and conditions:

  • Business Audience:
    • Verify proper contracts with vendors
    • Create governance policies and processes
    • Update external and internal communication
  • Technical Audience:
    • Configure vendors for ‘least-access’
    • Create data audit guidelines and tests
    • Test and audit internally for compliance

Form a Data Governance Panel: Activate internal processes so both business and technology teams can move forward.

  • Business Team Communicates with Technology team on:
    • Needs to drive marketing and customer experiences
    • Legal ramifications of non-compliance
    • Expectations of the business on technology
  • Technology Team Communicates with Business team on:
    • Best practices with access, transmission and storage of data
    • Protection of the data and the customer from ‘bad’ players (Internal, External, Partner)
    • Enablement of the business within reason

Provide Clear and Accurate Notice: Communicate your data policy across the organization, and to customers and vendors:

  • Business Team
    • Update Privacy Policy to reflect data usage (ex. cookie policy, IP usage)
    • Provide means for opt-out across all marketing
    • Communicate with Technology team on evolving data usage
  • Technology Team
    • Provide customers with Explicit Opt-In/Out
    • Ensure ‘Right to be Forgotten’ and general data deletion directives
    • Communicate to Business team and vendors of compliance changes or lack of compliance
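
The notice-and-consent step above .. explicit opt-in / out and ‘Right to be Forgotten’ deletion .. can be sketched as follows. Storage here is a plain in-memory dict for illustration; a real system needs durable, auditable records:

```python
# Hypothetical sketch: record explicit opt-in/out and honor a
# 'right to be forgotten' request. In-memory dicts stand in for
# durable, auditable storage; all names are illustrative.

from datetime import datetime, timezone

consents = {}   # email -> {"opted_in": bool, "updated": ISO timestamp, ...}
profiles = {}   # email -> customer data

def record_consent(email, opted_in):
    consents[email] = {"opted_in": opted_in,
                       "updated": datetime.now(timezone.utc).isoformat()}

def forget(email):
    """Right-to-be-forgotten: remove personal data, keep a deletion marker."""
    profiles.pop(email, None)
    consents[email] = {"opted_in": False,
                       "updated": datetime.now(timezone.utc).isoformat(),
                       "deleted": True}

profiles["ana@example.com"] = {"name": "Ana Ruiz"}
record_consent("ana@example.com", opted_in=True)
forget("ana@example.com")
# profiles no longer contains the customer; consents records the deletion
```

Keeping a timestamped deletion marker (rather than erasing all trace of the request) is one common way to demonstrate compliance later, though legal counsel should confirm what may be retained.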

Data Governance Providers

Many companies are more than happy to provide templates that support DG compliance (given the urgency of GDPR, most of these vendors are grabbing the low-hanging fruit).

Further analysis is required if a company chooses to use a vendor to speed its way to Data Governance.

Conclusion

The path to Data Governance is not an ‘if’ decision for companies; they need to address DG as a ‘when’ and ‘how’ initiative, providing executive support to the motions that satisfy clients and regulatory agencies. Note that the number of RFPs including DG questions is rapidly increasing, as is the number of DG regulations out there .. most immediately, the GDPR in the EU.
