Cheryl Stevens, Director, Enterprise Applications - Enterprise Data Management, Veolia North America
In today’s business world, a key differentiating factor is a company's ability to build an effective digital data strategy into its Business Process Management initiatives. The lines between technology and business process partners are converging, and a deeper understanding of process improvement opportunities will equip leaders to craft the most effective strategies. It is now easier than ever to be left behind if business process owners resist capitalizing fully on these new technical opportunities and partnerships. The most important component of a strong digital strategy is striking a balance between speed and scale, and that balance requires both technical and process components.
New cloud data technologies are key to moving the business forward quickly. A blend of the Lake and Hub models is emerging as one of the preferred architectural designs, and for good reason. Data lakes provide speed of analysis for unstructured data: users can upload data into the lake quickly and easily and begin analyzing it immediately, even with very large data sets. Lakes can consume unstructured data such as geo-location, demographic, marketing, social, commodity-trend, weather, and IoT feeds. Data science can then be applied to support decision making and identify opportunities to streamline processes or reduce risk. However, lakes alone are not effective for application data or other highly structured data. Nor is this always an either/or proposition: a business analyst will often need business context from inside the organization, such as regions, lines of business, or customer segments, to organize the unstructured information into logical groupings for analysis.
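The lake pattern described above can be illustrated with a minimal sketch. The device readings and region codes below are hypothetical, and pandas stands in for a lake query engine: semi-structured records land as-is, with no upfront schema design, and a business grouping (region) is applied at analysis time.

```python
import io
import pandas as pd

# Hypothetical example: raw, semi-structured IoT readings landed in a lake
# as newline-delimited JSON -- no upfront schema design is required.
raw_events = io.StringIO(
    '{"device_id": "pump-1", "region": "NE", "temp_c": 71.2}\n'
    '{"device_id": "pump-2", "region": "SE", "temp_c": 88.9}\n'
    '{"device_id": "pump-3", "region": "NE", "temp_c": 64.5}\n'
)

# Read the raw records as-is and begin analysis immediately.
events = pd.read_json(raw_events, lines=True)

# Business context (regions) organizes the raw feed into logical groupings.
avg_by_region = events.groupby("region")["temp_c"].mean()
print(avg_by_region)
```

The grouping step is where internal business context meets the raw external feed; in practice that context would come from the Hub described next.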
As a result, the data Hub model is emerging to serve these needs more quickly and at much lower cost than old data warehouse models. In a data Hub, master data (MDM) and other organizationally relevant dimension data can be logically managed across sources and governed in a single-source-of-truth solution. Unlike a data warehouse, a Hub can be built in an agile, iterative manner that lets business priorities drive development timelines. This model is important for use cases that must blend disparate sources, such as legacy systems, systems inherited through mergers and acquisitions, and new cloud platforms, into a single, meaningful data framework for the organization. New open source solutions are becoming available for the MDM portions of the architecture, further lowering costs, while open source languages such as Python can bring AI-like automation to technically complex data tasks once performed by highly skilled consulting teams. Strategic use of open source options can significantly reduce maintenance, testing, and development costs while shortening time-to-value and raising returns on investment for business leaders.
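One of those technically complex tasks is blending master records from disparate sources into a single governed record. The sketch below is a simplified illustration, not any specific MDM product: two hypothetical source systems hold overlapping customer records, and a basic survivorship rule produces the single source of truth.

```python
import pandas as pd

# Hypothetical customer records from two source systems (e.g. a legacy ERP
# and an acquired company's CRM), keyed by a shared master customer ID.
legacy = pd.DataFrame({
    "customer_id": ["C001", "C002"],
    "name": ["Acme Corp", "Globex Inc"],
    "segment": [None, "Industrial"],
})
acquired = pd.DataFrame({
    "customer_id": ["C001", "C003"],
    "name": ["ACME Corporation", "Initech LLC"],
    "segment": ["Municipal", "Commercial"],
})

# A simple survivorship rule: prefer the legacy system's value, and fall
# back to the acquired system's value only where the legacy data is missing.
merged = legacy.set_index("customer_id").combine_first(
    acquired.set_index("customer_id")
)
print(merged)
```

Real survivorship rules are set by business practice leaders through governance, per field and per source, but the shape of the problem is the same: disparate records in, one governed record out.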
Within the Hub, the other important priority is data process: data stewardship and data governance. In the master data structures, governance provides a single source of truth for key areas of the business, managed by business practice leaders. A Hub also pushes updates to important master data out to all consumers in real time, even feeding self-service BI. The technology is also less time consuming for the business when process work spans multiple applications: agile modeling allows the work to proceed across systems without requiring a business user to identify every possible reporting and data requirement for every impacted system up front. Time-to-value improves because weeks of business requirement sessions no longer have to finish before coding can begin.
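The real-time propagation pattern described above can be sketched as a simple publish/subscribe loop. The class and names here are illustrative assumptions, not any particular MDM product's API: when a steward applies a governed update, every registered consumer (a BI cache, a downstream application) receives it immediately.

```python
# Minimal sketch of hub-style change propagation: a governed update to a
# master record is pushed to every subscribed consumer as it happens.

class MasterDataHub:
    def __init__(self):
        self._records = {}
        self._subscribers = []

    def subscribe(self, callback):
        """Register a consumer to be notified of master data changes."""
        self._subscribers.append(callback)

    def update(self, key, record):
        """Apply a governed update and push it to all consumers."""
        self._records[key] = record
        for notify in self._subscribers:
            notify(key, record)

hub = MasterDataHub()

# A self-service BI extract acting as one consumer of master data.
bi_cache = {}
hub.subscribe(lambda key, rec: bi_cache.update({key: rec}))

# A steward corrects a customer record; the BI consumer sees it immediately.
hub.update("C001", {"name": "Acme Corp", "segment": "Municipal"})
print(bi_cache["C001"])
```

In production this role is typically played by event streams or change-data-capture feeds, but the contract is the same: consumers subscribe once and receive governed updates without polling.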
Another key source of speed in newer technologies is data movement, or data integration. Companies that adopt API technology and raise the bar above overnight batch processing models will come out on top. Business leaders can help drive value by adding API and data accessibility requirements to their RFP processes up front when working with third parties and software vendors. Where data movement used to be seen as a necessary evil of IT, it should now be seen as a key differentiator for business leaders: the faster the data moves, the faster relevant decisions can be made. Innovation in this space is also greatly reducing overhead and maintenance, for example through self-healing, delta-based ELT (Extract, Load, then Transform) logic. The reduced cost of data storage in newer cloud technologies allows data to be ingested in broader, more generic scopes, eliminating months of requirement gathering and high-effort integration development to pull only what is needed field by field. This has the added value of enabling a broader set of self-service data services for analysts, customers, other business users, and third parties.
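The delta-based ELT idea above can be sketched in a few lines. Table names, fields, and timestamps here are hypothetical: each run extracts only rows modified since the last successful load (the watermark), loads them raw, and defers transformation to the target platform.

```python
from datetime import datetime, timezone

# Illustrative source rows with a last-modified timestamp on each record.
source_rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1, tzinfo=timezone.utc), "amount": 100},
    {"id": 2, "updated_at": datetime(2024, 3, 1, tzinfo=timezone.utc), "amount": 250},
    {"id": 3, "updated_at": datetime(2024, 6, 1, tzinfo=timezone.utc), "amount": 75},
]

def extract_delta(rows, watermark):
    """Pull only records modified after the last successful load."""
    return [r for r in rows if r["updated_at"] > watermark]

# Watermark recorded by the previous successful run.
last_run = datetime(2024, 2, 1, tzinfo=timezone.utc)

# Extract and Load only the delta, as-is; Transform happens later, in-target.
delta = extract_delta(source_rows, last_run)
print([r["id"] for r in delta])  # only rows 2 and 3 move
```

Because only changed rows move, each run stays small regardless of table size, and a failed run simply leaves the watermark where it was so the next run picks the same delta back up, which is where the self-healing behavior comes from.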
We are in a very exciting age of data availability. Because these technologies compound in value as more data is added, the companies that recognize the value and embrace the opportunities quickly will advance the fastest. It is important for the organization to build strong partnerships and communication models between IT, business process management, and analyst functions. If that partnership is created effectively, Business Process Management will benefit from the expertise of each group as it embraces these new technological advancements in process improvement initiatives.