Austin Guyette, Author at Perficient Blogs

A Digital Business Platform for Financial Services
June 22, 2017

This guest blog post was authored by Alfresco’s Raphael Allegre

Complying with regulations and acquiring new customers were identified as the top two challenges facing the retail banking industry in a recent global survey conducted by PwC.

Regulatory Compliance

Anti-Money Laundering (AML), Basel II, Sarbanes-Oxley Act (SOX), OMB A-123, Data Privacy, Consumer Privacy, Check 21, SAS 70, BSA, PATRIOT Act, MiFID and Reg NMS are just a few of today’s existing financial services regulations. New or revised regulations are coming fast. The EU’s General Data Protection Regulation (GDPR) takes effect in May 2018 and affects all organizations holding information on EU citizens, with severe potential penalties (€20 million or 4% of global annual turnover, whichever is greater). In 2014 alone, the direct financial impact of regulations on European and US banks is estimated at US$65 billion in fines and penalties.

And that’s before you’ve even thought about the (possibly greater) indirect costs of hiring the required skills, transforming operations and continuously investing in new technologies to meet current and future regulatory compliance in ever-changing environments.

Attracting New Customers

The second top challenge reported by global banks is the emergence of non-traditional digital disruptors, which are challenging the established order by leading with customer-centric innovations that deliver better service and greater value. Given the cost and competitive pressures in the financial sector, banks are hungry for growth, and finding new customers is the first response for many bankers. But because acquiring a new customer costs far more than keeping an existing one, transforming the customer experience is usually the ultimate answer for both retaining and attracting clients.

How Do These Challenges Impact Day-To-Day Operations?

As new regulations were introduced over time, banks adopted point solutions to manage compliance activities in silos, separately for each line of business (LOB). This has created duplicated effort and disparate data sets, documentation, and processes for risk reporting and testing. With so many disparate systems, employees struggle to find the right information at the right time, lowering operational efficiency. Jumping back and forth between applications, emails, local Excel spreadsheets, and scattered information creates an error-prone environment. In classic investor services back-office operations such as account opening, customer onboarding, and account maintenance, meeting regulatory standards means being able to prove you followed the right procedure, with a history of all completed operations. When these systems are not integrated, generating a comprehensive audit trail of end-to-end operations is costly, if not impossible.

The ability to rapidly acquire new customers depends heavily on the efficiency of key business processes across all customer touch points (e.g., onboarding, contract management, customer service). Unfortunately, the flow of information is often slowed or stopped by labor-intensive operations, lack of process visibility, poor integration between systems, lack of contextual information for decision-making, and difficult collaboration across LOBs.

Transforming the Customer Onboarding Process

What capabilities do banks need to meet compliance with minimum intervention and deliver a greater experience for onboarding customers?

Seamless process automation & orchestration
The first critical need is to automate all the manual, error-prone tasks and orchestrate the entire flow of interactions from end to end. An automated digital onboarding process acts as the glue that connects multiple LOBs, content, and systems together. From a customer’s point of view, digital process automation makes the entire experience more engaging, with self-service data entry, real-time visibility and communication, and fast delivery of service. From the bank’s perspective, process automation brings not only speed but also better oversight of business operations, with the ability to identify bottlenecks in real time and fix them fast. It also retains full historical data on past operations to feed intelligent analytics and process optimization. Finally, digital process automation helps improve employee satisfaction by reducing friction caused by operational inefficiencies and manual errors.
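To make the orchestration idea concrete, here is a minimal sketch of an onboarding flow modeled as an ordered set of steps with a recorded history. All names are illustrative; a real process engine adds persistence, user tasks, timers, and system integrations.

```python
# Illustrative step names for a digital onboarding flow.
ONBOARDING_STEPS = ["data_entry", "kyc_check", "document_review", "account_open"]

class OnboardingProcess:
    def __init__(self, customer_id):
        self.customer_id = customer_id
        self.history = []      # every completed step, kept for analytics and audit
        self._index = 0

    @property
    def current_step(self):
        # End-to-end visibility: the engine always knows where a case sits.
        if self._index >= len(ONBOARDING_STEPS):
            return "complete"
        return ONBOARDING_STEPS[self._index]

    def complete_step(self, outcome):
        self.history.append((self.current_step, outcome))
        self._index += 1

process = OnboardingProcess("cust-001")
while process.current_step != "complete":
    process.complete_step("ok")
```

Because every completed step lands in `history`, the same structure that drives the flow also supplies the historical data for bottleneck analysis described above.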

Content management plus process automation
All documents collected during onboarding need to be gathered in a single place: searchable, secured, and contextually accessible at any time throughout the process. Information captured during the process, or pulled from any LOB application, should be convertible into a document that provides contextual insight for decision-making or forms part of the process outcome.

Low-effort Compliance
Meeting regulatory compliance for the onboarding process requires being able to demonstrate that collected customer information has been securely stored and managed throughout the entire process. The flow of completed transactions driven by the process engine needs to be automatically converted into records ready for third-party regulators and long-term retention. Documents needed for regulatory compliance should be handled automatically during the process to avoid extra effort after onboarding.
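One hedged sketch of that conversion: each completed transaction becomes an append-only record whose hash chains to the previous one, so the audit trail handed to regulators is tamper-evident. The record shape here is an assumption for illustration, not any product’s actual format.

```python
import hashlib
import json

def make_record(previous_hash, transaction):
    """Convert a completed transaction into an immutable, chained audit record."""
    body = json.dumps(transaction, sort_keys=True)          # canonical serialization
    digest = hashlib.sha256((previous_hash + body).encode()).hexdigest()
    return {"previous": previous_hash, "transaction": transaction, "hash": digest}

chain = []
previous = "0" * 64                 # genesis value for the first record
for txn in [{"op": "open_account", "customer": "C1"},
            {"op": "upload_doc", "customer": "C1", "doc": "passport"}]:
    record = make_record(previous, txn)
    chain.append(record)
    previous = record["hash"]
```

Altering any earlier transaction changes every later hash, which is what makes the retained record trustworthy for long-term retention.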

The Open Digital Platform to Help You with These Challenges

The Alfresco Digital Business Platform (DBP) is purpose-built to move beyond what traditional packaged applications provide. It accelerates development of digital solutions where processes flow seamlessly, content is presented in context, and regulatory compliance is met. Instead of focusing on BPM or ECM or Records Management (RM) in isolation, the Alfresco Digital Business Platform is the only platform on the market today that brings these core capabilities together.

What’s Unique?

By simplifying business process flows and integrating content with business processes across physical, virtual, cloud, and hybrid cloud environments, the Alfresco Digital Business Platform helps organizations deliver business and customer benefits in days or weeks rather than months. Organizations are leveraging Alfresco’s open platform to take business processes that used to require expensive, lengthy, one-off IT projects and engineer them into a model-driven platform. Kyle Pause, director of SaaS Platform Development at Pitney Bowes, said in a recent article: “as Pitney Bowes develops and brings to market new SaaS offerings for our clients, we can now wire them to our customer onboarding business processes in a matter of hours instead of days or weeks.”

Adding Perficient’s Trex to Complete the Solution

Working in combination with Alfresco’s Digital Business Platform, Perficient’s Trex provides a highly configurable and extensible user interface layer with a template designed specifically for the financial services industry. Trex addresses pain points in the financial services industry by:

  • Streamlining manual processes by automating workflows and business rules.
  • Working in combination with Alfresco to reduce and/or eliminate paper documents.
  • Ensuring regulatory compliance by making sure that a consistent process is followed every time, for every customer account.
  • Providing a modern digital experience that meets the needs of evolving investors.

 

For more information, go to www.alfresco.com, https://www.alfresco.com/partners/perficient-inc, and www.perficient.com

Common Enterprise Security Risks
April 4, 2017

Today’s cybersecurity climate demands a sound understanding of security risks across your enterprise IT infrastructure, from threats both outside and inside your organization. While outside threats seem the most worrisome and probably account for the majority of attempted attacks, according to SpectorSoft, insider attacks cost companies a combined $40 billion in 2013. Most organizations will say they have a robust security strategy, but how many of these common security risks has your IT department addressed?

Applet viewers

As discussed in my previous blog post, The Plugin-Free Web, applet viewers represent a significant security risk. Eliminating applet viewers and moving to an HTML5 viewer is a must for all IT departments.

 

Using deprecated hash algorithms in SSL certificates

As technology evolves, it is critical to stay ahead of those who wish to defeat cryptographic technologies, and using proper hash algorithms in SSL certificates is imperative. SHA-1 is a deprecated hash algorithm that should no longer be used inside any organization; as of January 1, 2017, all certificates should use SHA-256.
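One quick way to audit this is to inspect each certificate’s signature algorithm, for example in the text produced by `openssl x509 -noout -text`, and flag deprecated digests. The parser below is a sketch against an illustrative output fragment, not a full X.509 tool.

```python
import re

DEPRECATED_DIGESTS = {"md5", "sha1"}    # digest families no longer acceptable

def weak_signature(openssl_text):
    """Return the signature algorithm if it uses a deprecated digest, else None.

    Expects text in the shape produced by `openssl x509 -noout -text`.
    """
    match = re.search(r"Signature Algorithm:\s*(\S+)", openssl_text)
    if not match:
        return None
    algorithm = match.group(1)
    digest = algorithm.lower().split("with")[0]   # 'sha1WithRSAEncryption' -> 'sha1'
    return algorithm if digest in DEPRECATED_DIGESTS else None

# Illustrative fragment of `openssl x509 -noout -text` output:
sample = "Certificate:\n    Signature Algorithm: sha1WithRSAEncryption\n"
```

Running such a check across your certificate inventory turns the SHA-256 policy from a guideline into something enforceable.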

 

Outdated patches and fix packs

Most organizations take an “if it ain’t broke, don’t fix it” approach to software version updates, but that short-sighted thinking can lead to trouble. Without regular maintenance of all components of your platform, releases containing critical security patches and major user-facing fixes can necessitate daunting, multi-version upgrades, leaving your organization vulnerable while you scramble to complete the fix. Regular maintenance of web servers, load balancers, application servers, operating systems, and Java is critical for security and keeps your organization running smoothly. For more information, please read Eric Walk’s excellent blog post on this topic.

 

Bypassing SSL and creating security holes into your organization

When leaving the house, you wouldn’t lock the front door while leaving the back door open, so why do some organizations allow people to bypass SSL in their production environments? SSL is only effective if you encrypt information at all layers of your system; otherwise, you open security holes and potentially allow HTTP session hijacking with a simple URL change.
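In practice this means the web tier should refuse to serve plain HTTP at all. A minimal, framework-agnostic sketch of the rule (a real deployment would also send an HSTS header so browsers stop attempting HTTP):

```python
def enforce_https(scheme, host, path):
    """Redirect any plain-HTTP request to its HTTPS equivalent.

    Serving content only over HTTPS means a 'simple URL change' to http://
    can never expose an unencrypted session to hijacking.
    """
    if scheme != "https":
        return 301, "https://" + host + path     # permanent redirect, no content
    return 200, None                             # safe to serve the response

status, location = enforce_https("http", "bank.example.com", "/accounts")
```

The key design point is that the insecure request receives a redirect instead of data, so no session cookie or page content ever travels unencrypted.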

 

Guessing document versions

Modern ECM systems often use GUIDs as document identifiers, but legacy systems may have used an integer sequence to generate their document ids. In fact, FileNet Image Services and IBM Content Manager both use integers in some way to identify a document. Worse, legacy systems usually fail to implement document-level security. This means a URL hacker might be able to access arbitrary documents by simply viewing document id 12345677 and then changing the URL to point to 12345678. This is a major problem for any system containing sensitive information and, for systems containing health records, a serious HIPAA compliance risk.
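Two mitigations work together here: non-guessable identifiers and a per-document authorization check, so that even a correctly guessed id is refused. A sketch, where the store and ACL shape are assumptions for illustration:

```python
import uuid

class DocumentStore:
    """Toy document store: non-guessable ids plus document-level security."""

    def __init__(self):
        self._docs = {}   # doc_id -> (content, set of authorized users)

    def add(self, content, allowed_users):
        doc_id = str(uuid.uuid4())          # random GUID, not a sequential integer
        self._docs[doc_id] = (content, set(allowed_users))
        return doc_id

    def fetch(self, doc_id, user):
        # Even a correctly guessed id fails without document-level authorization.
        content, allowed = self._docs.get(doc_id, (None, set()))
        if user not in allowed:
            raise PermissionError("access denied")
        return content

store = DocumentStore()
chart_id = store.add("patient chart", allowed_users={"dr_smith"})
```

Note that the GUID alone is not the defense; the `fetch` check is what makes incrementing a URL useless.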

 

Poor password and access policies

Shared application or database IDs; giving users access to things they don’t need; and storing encryption keys, default credentials, and passwords in files are all examples of weak policies that compromise security. Administrative accounts are the crown jewels for malicious actors seeking unauthorized access to a system, and shared application or database IDs represent a single point of failure for your entire system’s security. Users should only have access to the things they need; weak access policies can expose sensitive data to people who don’t need it. Similarly, passwords should not be given to operating system users that don’t absolutely need them. Finally, storing secrets where they may be readily accessible to savvy individuals who can access the files on your web server is NEVER a good idea. Products like Vault make strong password and access policies easy to implement.
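A small sketch of one of those policies: pull secrets from the process environment (populated by a secrets manager such as Vault) and fail loudly when a secret is absent, rather than reading credentials from files on the web server. The variable name is illustrative.

```python
import os

def get_secret(name):
    """Fetch a secret injected into the process environment.

    The environment is populated by a secrets manager, not by credential
    files sitting on the web server; a missing secret is a hard failure,
    never a silent fallback to a default password.
    """
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"secret {name!r} is not configured")
    return value

# For illustration only -- in a real deployment this is set outside the code:
os.environ["DB_PASSWORD"] = "example-only"
password = get_secret("DB_PASSWORD")
```

Failing hard on a missing secret surfaces misconfiguration at startup, instead of letting the application limp along with a default credential.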

 

Virus-laden downloads and committed documents

Each employee is in charge of their own workstation, and downloads to their PC can pose a security threat to your entire organization. No matter how a virus is downloaded, you’re lucky if the damage is contained to that single computer. Unfortunately, if a virus-laden document is committed to your ECM system, that virus can spread to your entire organization, giving attackers access to your most precious data. This risk can be reduced by adopting HTML5 viewers, which make downloading documents unnecessary, but only proper network isolation can truly contain this threat.

 

Logging of personally identifiable information (PII) in trace logs

Yes, proper trace logging helps you identify problems with your environment, but PII is sensitive information that should NEVER appear in trace logs. This threat most often occurs accidentally, when logging levels are increased to resolve an issue. Once PII is in your logs, it can be very difficult to remove without deleting the entirety of your organization’s trace logs. Developers and administrators should take proper steps to ensure that PII can never be logged, even accidentally.

 

Prod vs Dev Clarity

Mistakes happen, and proper separation of your environments is essential for limiting the cost of an accidental reboot or a dropped database table. Having explicit procedures for production changes and clear separation between your development environment and your production system means that mistakes in DEV have little impact on your real data and users.

 

Defining, recognizing and handling risky behavior

As I mentioned earlier, insider attacks are extremely costly, and defending against them requires your organization to define, recognize, and handle risky behavior. According to PwC, employees who committed cyber crimes exhibited suspect behavior beforehand; if your organization has a protocol for handling such behavior, you may be able to stop insider attacks before they happen.

The Plugin-Free Web: Is Your Business Ready?
March 6, 2017

John and Amy sit down with their morning coffee and start browsing the news. Unfortunately, they are both served an advertisement with a malicious applet inside. John’s computer is passively infected with a virus that spreads inside his company’s network, but Amy’s computer is unaffected and her company is safe. What’s the difference between John and Amy? John is using IE11, as mandated by his company’s IT department, with NPAPI enabled, allowing the malicious applet access to his computer. Amy is using MS Edge, which doesn’t allow the applet access. It’s safe to say that any business would rather have Amy than John in this situation.

Why are we talking about John and Amy? Because a plugin-free web is coming, and organizations may not understand what this means for their IT security. Oracle has announced that it will deprecate NPAPI (the Netscape Plugin API, which allows Java applets to run in a browser context) with the release of Java 9, and many browser vendors have removed support for standards-based plugins entirely. The risk illustrated by John and Amy is real and growing, and these announcements matter because they may require organizations to update their systems to eliminate security risks.

The current landscape may lull some organizations into a false sense of security. Internet Explorer through version 11 and Firefox versions 53 and lower still support Java applets, so businesses that rely on applets for document viewing and annotation may feel that change is still years away. The risk in this thinking lies not with the applets or browsers themselves, but with the NPAPI that allows the applets to run in the browser.

Applets currently represent a security risk because, if there is a vulnerability in an applet, the attack surface for malicious code is large: any web page can access the applet while it is running. This risk is exacerbated by the deprecation of NPAPI, which eliminates the possibility of remediation for any future vulnerability in applet-based functionality. Simply put, if someone wants to rob you and finds the door to your house open, you cannot shut the door, and the robber can walk in with all of his friends.

Your business can eliminate this security risk by moving to other technologies in lieu of applets before the release of Java 9 (expected in late July 2017). In the content management world, for example, many organizations are moving to an HTML5 document viewer instead of the traditional Java applet viewer for image viewing and annotations. The two options have similar functionality, but an HTML5 viewer eliminates the security risk now presented by the Java applet viewer.

Business Process Management Evolution, Part 5: Efficiency, Consistency, Compliance. Trex.
December 19, 2016

Efficiency, consistency, and compliance are the three core tenets of Perficient’s Trex BPM solution. A BPM system should be efficient, helping users achieve maximum productivity with minimum wasted effort or expense. A BPM system should be consistent, helping users process work by ensuring that the same procedures are followed every time a task is completed. Finally, a BPM platform should enable compliance, helping your organization ensure that compliance standards continue to be met and followed.

Trex is efficient. Perficient’s Trex BPM solution can help your organization reach maximum productivity while simultaneously achieving cost savings in both business units and your IT department. Trex includes out-of-the-box features that enable automated workflow processing and reduce dependence on complex manual processes that extend transaction times. By distributing work in a structured manner, integrating with legacy systems, and effectively managing exceptions, Trex dramatically reduces the processing cycle and increases productivity.

Trex uses a queue-based framework to organize incoming requests by transaction type and provides a customized processing interface for each. Customer request processors have the power to access and collect transaction-relevant content without compromising their ability to work in a legacy environment. Accuracy can be improved by 80% and the time required to resolve exceptions can be decreased by up to 60% – allowing organizations to respond immediately and accurately to customer service inquiries, resulting in significantly enhanced customer satisfaction. Overall, Trex can boost efficiency by up to 20% resulting in significant cost savings and decreased time to revenue.

Trex is also consistent. Perficient’s Trex BPM solution ensures that your business processes are executed consistently every time. Back-office processors are presented only with the information relevant to each step of the workflow, which clarifies the task at hand. Automatic Get-Next functionality lets managers decide what users process and prevents cherry-picking of work. Audit logging allows internal auditors to randomly review a processor’s responses at each step of a workflow, so the auditor can evaluate whether the work was completed correctly. During the evaluation, auditors can create a list of errors and comments, which can be sent back to the processor for resolution.
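The Get-Next idea can be sketched as a queue that only ever hands out its oldest item, so processors cannot browse and cherry-pick work. The names below are illustrative, not Trex’s actual API.

```python
from collections import deque

class WorkQueue:
    """FIFO work distribution with no browsing: users can only take the next item."""

    def __init__(self):
        self._items = deque()
        self.audit_log = []                 # who processed what, for auditors

    def add(self, work_item):
        self._items.append(work_item)

    def get_next(self, user):
        # No peek or select: the oldest item is assigned, preventing cherry-picking.
        item = self._items.popleft()
        self.audit_log.append((user, item))
        return item

q = WorkQueue()
q.add("request-1")
q.add("request-2")
first = q.get_next("alice")
```

The same assignment call also feeds the audit log, which is what lets auditors sample who completed which item after the fact.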

Finally, Trex helps your organization remain compliant even against a backdrop of ever-increasing and changing regulations. Trex ensures compliance in a variety of settings and can help organizations navigate everything from Know-Your-Customer laws in the onboarding process to the intricacies of the Fiduciary Rule. Trex provides reports that can be used to show compliance and help with management tracking, and it includes a document checklist that integrates with a rules engine to ensure that the right documents are identified for each transaction, reducing or eliminating errors and omissions. Trex can also help with document lifecycle management by triggering work items when documents reach a certain age and by purging documents at the right time. Finally, by integrating with third-party databases, Trex’s rules engine can help identify customers with a high-risk profile during the client onboarding process.

Efficiency, Consistency, Compliance. Trex.

Business Process Management Evolution, Part 4: Trex for Client Onboarding
December 5, 2016

In this blog series, we examine BPM from multiple perspectives; you can find my last post, Part 3 – The Perficient Approach, here. Part 4 of our series looks at specific challenges faced by investor services firms in the client onboarding process.

Unique Challenges in the Investor Services Industry

Did you know that a large, Tier 1 bank can spend more than $100 million a year on client onboarding when you consider information technology and operations? Since onboarding is the client’s critical first experience with a firm, why does the process remain largely manual, error-prone, time-consuming, and risky in terms of compliance with global regulations and the client experience? There are four primary reasons that client onboarding remains a pain point for investor services firms:

  • Complex manual processes are largely paper-based and involve 300-600 questions, 5-10 workflows, 30-150 documents, and 200-300 business rules.
  • Re-keying of data increases errors: multiple forms requiring manual data entry are common, introducing incomplete or illegible responses and typos that add cost and increase cycle time.
  • Relatively inflexible processes mean that new product introductions, changes in regulations and compliance policies, and organizational changes highlight shortcomings.
  • Processes require significant training to execute correctly, which means niche skills acquired over time and staff turnover contribute to operational inconsistencies and increased compliance risk, exacerbated by changing regulations and requirements that vary by jurisdiction.

But if your firm is still doing well with its current client onboarding process, why revisit something that doesn’t seem broken? Three industry drivers will only magnify problems in your onboarding process and are catalysts for many firms to improve the client onboarding experience.

  • Demographic and intergenerational shifts in wealth are occurring due to the increase in women investors, the rise in middle class investors from emerging markets, and the transfer of wealth from baby boomers to Gen Xers and Millennials. These newer demographics are used to “anytime, anywhere, any device” access and firms that cannot offer these capabilities are seen as antiquated among newer investors.
  • Regulatory concerns dominate the industry where global banks have collectively paid fines in excess of $15 billion after being cited for violations of KYC and AML laws. In addition, excellent documentation processes must be in place to demonstrate your practices and charges to clients.
  • Changing technology standards enable new ways to streamline transactional processes, share data, and eliminate operational handoffs and paperwork. All clients, regardless of age or demographic, expect a seamless, digital, onboarding experience and legacy systems cannot deliver a modern digital experience.

Perficient BPM Streamlines the Client Onboarding Process

Perficient’s Business Process Management solutions, including Trex for Investor Services, can help streamline the client onboarding process while lowering your risk for errors and omissions. Our BPM expertise can help firms address pain-points in the client onboarding process by:

  • Streamlining manual processes by automating workflows and business rules. Perficient’s Trex BPM platform allows documents to be routed automatically to the right person at the right time and information is presented to the person processing work only when it is relevant to the task which makes processing requirements clear.
  • Reducing and/or eliminating paper documents by storing information electronically in a secure environment. Documents are stored securely and can be accessed as needed by multiple people simultaneously, helping speed time to revenue. Also, search mechanisms help locate information that has already been obtained so you don’t ask clients for the same information twice. Robust disaster recovery strategies can be implemented to ensure documents are backed up and secure at all times.
  • Ensuring regulatory compliance by making sure that a consistent process is followed every time for every customer account, and providing reports that can be used to show compliance and help with management tracking. Trex also includes a document checklist that integrates with a rules engine to ensure that the right documents are identified for each transaction, reducing or eliminating errors and omissions. Trex can also help with document lifecycle management by triggering work items when documents reach a certain age and by purging documents at the right time. Finally, by integrating with third-party databases, Trex’s rules engine can help identify customers with a high-risk profile during the onboarding process.
  • Providing a modern digital experience that meets the needs of evolving investors. Trex can integrate with mobile applications for advisors to provide information updates, speeding turnaround for onboarding and transactions.
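The document-checklist idea above can be sketched as a small rules table mapping transaction type to required documents, with the gaps reported before a transaction proceeds. The rules and names here are illustrative, not a real rules engine.

```python
# Illustrative rules: required documents per transaction type.
CHECKLIST_RULES = {
    "individual_account": {"application", "photo_id", "w9"},
    "trust_account": {"application", "photo_id", "trust_agreement"},
}

def missing_documents(transaction_type, received):
    """Return the set of required documents not yet received for a transaction."""
    required = CHECKLIST_RULES.get(transaction_type, set())
    return required - set(received)

gaps = missing_documents("trust_account", ["application", "photo_id"])
```

Driving the checklist from data rather than code is what lets compliance teams update requirements as regulations change, without a development cycle.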

With the efficiency, consistency, and compliance offered by Trex, investor service firms can streamline their client onboarding process and automate back office processing to eliminate bottlenecks in their business processes.

Business Process Management Evolution, Part 3: The Perficient Approach
November 14, 2016

In this blog series, we examine BPM from multiple perspectives; you can find my last post, Part 2 – BPM as a Solution, here. Part 3 of our series analyzes Perficient’s approach to BPM.

The Perficient Approach

The Perficient approach to BPM is predicated on the idea that large production BPM solutions will always require design and customization to fully exploit the promise of the tools.  In each organization, there will be some unique and custom features that, based on their repeated usage, will provide tangible benefits.  This typically includes integration with existing systems, specialized forms and calculators, and other special capabilities that automate frequently performed processes.

At the same time, through 30 years of experience implementing BPM, Perficient staff have developed scores of specialized features that are used consistently in most high volume, production systems but are not included in typical out of the box vendor BPM tools.  This has led us to see the need for a base user application layer that can provide these additional standard tools but is also designed to be highly configurable and extensible to meet the custom needs of large clients.

Trex Templates to bridge the gap to industry solutions

Trex is an application that sits on top of BPM and ECM software to provide a highly configurable and extensible user interface layer designed specifically for high volume, production BPM solutions.  It includes many features specifically designed for high volume environments that are never found in typical out of the box BPM tools.  Furthermore, it is designed to be configured and extended to make high volume, production systems as efficient as they can be.

As any software discipline matures over time, there is a steady movement from highly customized software to product software.  Over time certain applications are built repeatedly, and someone invests in building a product version.  If the market continues to expand, these product versions become more sophisticated and more varied and custom development is left for requirements that are truly unique to each organization.  Implementation becomes more efficient because fewer parts of a system need to be built from scratch.

Perficient sees Trex as the first step in this process for high volume, production BPM applications.  We have developed a set of Application Templates for several common production applications including:

  • Investor Services
  • Loan Operations
  • Life Insurance and Annuity
  • Property and Casualty Insurance

These templates are preconfigured base systems that include routing maps, data definitions, user interface definitions, and certain custom features that are specific to an application area.  They incorporate best practices learned from multiple implementations but they are meant to be extended and customized based on the unique requirements of a client.  They provide a further head start in the process of designing the BPM environment and avoid the implementation team having to redesign basic functions that have been developed many times before.

Trex methodology to complete the picture

Building a good BPM solution requires full participation by both IT and the business, so it demands a methodology that encourages collaboration.  In addition, because a good BPM solution often makes radical changes in the way the business performs and manages work, it requires strong facilitation to help the business understand the possible options.  Some middle to lower level business users will tend to resist major changes in how they do their work and how they are organized and will want the new solution to conform to the way they are used to doing things. This resistance negates the value that BPM technology can bring to the process redesign.

Perficient has developed a BPM design methodology around Trex and the solution templates.  While Trex itself and the appropriate template provide a base to help both business and IT understand the core capabilities of BPM tools, the methodology is built around a series of interactive, facilitated design sessions that are used to jointly design the work classes, workflows, and major screens that comprise the reengineered processes.  This allows the kind of give and take needed to efficiently consider many options while also fully understanding the business and technical constraints that may not be apparent at first glance.

Based on the goals and objectives of the business processes, the design sessions and follow up work create a system concept for the BPM solution that describes in functional terms how the solution works and how users will interact with it.

Many BPR methodologies start with an “as-is” phase, in which the design team carefully documents all of the current processes.  We do not include it in our standard methodology because this approach tends to focus analysis on the details of current procedures instead of the purpose of the business processes being analyzed.  Often, lower level staff are so immersed in the details of the process that they do not fully understand the ultimate business objectives.

Once a new concept for the BPM solution is created and approved by stakeholders, we can proceed with a more traditional system implementation approach which can be tailored to the clients’ standard development processes.  Perficient’s approach has been used to successfully implement many high volume, production BPM solutions across many industries.

 

Trex as a foundation

The following sections describe a small sampling of some of the unique features that Trex contains which make it so well suited as the basis for a high volume, production, human-centric BPM solution.


Hierarchical Queue Tree

Most production systems have a large number of queues representing various discrete functions and separate organizational splits of work waiting for that function.  A typical system can have more than 100 separate queues to organize the work.  Trex allows you to configure these queues as a standard hierarchical tree in which specific branches can be expanded and collapsed.  While each user will only see the queues to which they have access, this can still be a large number for management and supervisory users.  Most systems just list the queues, making it difficult to find a specific queue or requiring a separate queue search capability.  With the queue tree, each user can expand or collapse the tree to expose the queues they use frequently while still having quick access to all the queues.  Trex remembers the state of the queue tree for each user for further efficiency.
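Conceptually, the queue tree is a recursive structure filtered by each user's access.  Trex's internals are proprietary, so the sketch below is purely illustrative; the node names, queue names, and `visible_queues` helper are all invented for this example:

```python
from dataclasses import dataclass, field

@dataclass
class QueueNode:
    """One node in a hierarchical queue tree (hypothetical model, not Trex's API)."""
    name: str
    children: list = field(default_factory=list)

def visible_queues(node, user_queues, path=""):
    """Return the full paths of queues this user is allowed to see."""
    full = f"{path}/{node.name}" if path else node.name
    paths = [full] if node.name in user_queues else []
    for child in node.children:
        paths.extend(visible_queues(child, user_queues, full))
    return paths

tree = QueueNode("Operations", [
    QueueNode("Loans", [QueueNode("New"), QueueNode("Exceptions")]),
    QueueNode("Claims", [QueueNode("Intake")]),
])

# A processor with access to only three queues sees a pruned tree.
print(visible_queues(tree, {"Operations", "Loans", "New"}))
```

The per-user expand/collapse state the text describes would simply be a persisted set of branch paths alongside this structure.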

 

Get Next Processing

High volume, production BPM by definition includes many transactions that are repetitive and somewhat mechanical.  Users processing these transactions do not need the ability to browse through a list of potential transactions.  This just wastes time, and in most cases, the business would rather set up standard rules defining any priority ordering scheme for the transactions.

If a user is processing this type of queue, the system should just present them with the next work item based on configured rules.  As soon as they complete the work item, it should give them another until all work items are completed or until they leave the queue.  This ensures the highest level of worker efficiency and that work is done based on business priorities.

Trex allows each queue to be configured for Get Next or Browse access for each role.  If a queue is Get Next only, the sort configured for the queue will determine the order in which work items are assigned to users who are ready for work.  An additional capability allows the definition of filters so that certain roles will only be assigned work items for which they are trained.  Each queue can also be set to push work items that are specifically assigned to an individual first before they get other unassigned work.  Together these features allow for great flexibility in automatically pushing work out to users.
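To make the selection rules concrete, here is a hypothetical sketch of Get Next logic.  None of this is Trex code; the item fields (`assignee`, `type`, `priority`), the skill filter, and the sort key are illustrative stand-ins for the configured rules described above:

```python
def get_next(queue, user, sort_key):
    """Pick the next work item for a user from a Get Next queue.

    Items already assigned to the user are pushed first; otherwise the
    best unassigned item matching the user's skill filters wins, in the
    order defined by the queue's configured sort.
    """
    def eligible(item):
        return item.get("assignee") in (None, user["name"]) and \
               item["type"] in user["skills"]

    candidates = [i for i in queue if eligible(i)]
    if not candidates:
        return None
    # Assigned-to-me items outrank unassigned ones; then the sort applies.
    return min(candidates,
               key=lambda i: (i.get("assignee") != user["name"], sort_key(i)))

queue = [
    {"id": 1, "type": "claim", "priority": 2, "assignee": None},
    {"id": 2, "type": "claim", "priority": 1, "assignee": None},
    {"id": 3, "type": "loan",  "priority": 1, "assignee": None},
    {"id": 4, "type": "claim", "priority": 3, "assignee": "alice"},
]
alice = {"name": "alice", "skills": {"claim"}}
item = get_next(queue, alice, sort_key=lambda i: i["priority"])
print(item["id"])  # the assigned item wins despite its lower priority
```

Note that the loan item is never offered to this user at all: the skill filter keeps untrained processors away from work they cannot complete.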

Browse Tree

In some cases, it is appropriate to give some or all users the ability to browse the work items in a queue and manually select which ones they will access.  This makes sense for a queue where processing order is determined by external events.  For example, work items might be selected based on incoming phone calls.  It also makes sense for supervisory users who have responsibility for managing the queue.

But in a high volume, production system, queues could have thousands or tens of thousands of work items.  How can you effectively browse through such a large number of items?  Trex uses the concept of browse trees to summarize the contents of the queue based on the possible values of specific preconfigured fields.  Each level in the tree is defined by a configured work item field that has a reasonably limited number of potential values.
When the queue is selected, Trex first builds and displays the browse tree based on the current values of work items in the queue.  The tree only includes branches for work item field values that are currently represented in the work items in the queue.  Each queue can have a different set of work item fields configured.

Each branch of the tree lists the number of work items at that branch.  The user can select a branch at any level of the tree, and Trex then displays the list of work items corresponding to that branch.  So the user can decide based on knowing the number of work items that will be displayed.  Trex displays the work item list in pages since in a large system it could still be hundreds of work items or more.
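The browse tree can be thought of as a recursive group-by over the configured fields, with work item counts at the leaves.  A minimal sketch, with invented field names (`state`, `product`), not Trex's actual implementation:

```python
def browse_tree(items, fields):
    """Summarize queue contents as nested branch counts.

    Each level groups by one configured work item field; only values
    actually present in the queue produce branches.
    """
    if not fields:
        return len(items)
    head, *rest = fields
    tree = {}
    for value in sorted({i[head] for i in items}):
        subset = [i for i in items if i[head] == value]
        tree[value] = browse_tree(subset, rest)
    return tree

items = [
    {"state": "NY", "product": "auto"},
    {"state": "NY", "product": "home"},
    {"state": "NY", "product": "auto"},
    {"state": "CA", "product": "auto"},
]
print(browse_tree(items, ["state", "product"]))
# {'CA': {'auto': 1}, 'NY': {'auto': 2, 'home': 1}}
```

Selecting a branch then amounts to filtering the queue by that branch's field values before paging the results.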

Integrated Content and Work Items

While BPM and workflow processes do not require unstructured content, most high volume, production systems have a need to associate some content with work items.  These could be individual forms filled out by customers, backup documentation on which decisions are based, or copies of the exact correspondence sent to the customer.  Efficient handling of this associated content is critical to making a high volume, production system efficient and flexible.  Trex leverages ECM technology to provide a unified user application that efficiently handles both the work items and any associated content.

The work item processing screen displays a single work item, including various work item data.  The work item also includes a list of document attachments.  The user can see properties of the documents and has access to a number of features including viewing one or more of the documents, checking revisable documents in and out, re-indexing individual documents, or viewing the version history of a document.  Each queue can also be set to automatically load documents when the work item is opened for those work steps that usually require viewing the documents.  This can be very efficient for high volume steps.  For longer lists of documents, the user can toggle the attachments pane to a tree structure that organizes the documents by common property values.

Document Checklist

Beyond simply integrating with ECM, a well-designed system ensures that all content necessary for a business process is contained in the workflow. Trex retrieves a list of required documents based on work item or external information. The Document Checklist can be configured to force the correct set of documents to be present before a work item can proceed to the next step in processing and is integrated with a rules engine to associate specific document sets with different types of work items.
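Such a checklist boils down to comparing required document sets against attached document types before routing proceeds.  A hypothetical sketch; the work item types and document names here are invented, and a real rules engine would of course be far richer:

```python
# Illustrative checklist rules keyed by work item type (names invented).
REQUIRED_DOCS = {
    "mortgage":  {"application", "appraisal", "income_proof"},
    "auto_loan": {"application", "title"},
}

def can_proceed(work_item, attached_doc_types):
    """Block routing to the next step until every required document is present."""
    missing = REQUIRED_DOCS[work_item["type"]] - set(attached_doc_types)
    return (not missing, missing)

ok, missing = can_proceed({"type": "mortgage"}, ["application", "appraisal"])
print(ok, missing)  # False {'income_proof'}
```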

Pend Functionality

A standard function of virtually every high volume, production system is the ability to put work aside while waiting for a response from an external party.  This requires the work to be available in case the response comes in as well as to automatically be presented to the user if no response is received in a set amount of time.  Trex calls this capability Pend and includes it in the base system functionality.
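The Pend pattern reduces to stamping a resume time on the work item plus a periodic sweep for expired waits.  A rough sketch of the idea, not Trex's implementation:

```python
from datetime import datetime, timedelta

def pend(work_item, days, now=None):
    """Set a work item aside until a response arrives or the timer expires."""
    now = now or datetime.now()
    work_item["status"] = "pended"
    work_item["resume_at"] = now + timedelta(days=days)
    return work_item

def due_for_followup(items, now=None):
    """Pended items whose wait has expired get pushed back to a user."""
    now = now or datetime.now()
    return [i for i in items if i["status"] == "pended" and i["resume_at"] <= now]

t0 = datetime(2016, 11, 1)
pended = [pend({"id": 1}, days=10, now=t0), pend({"id": 2}, days=30, now=t0)]
print([i["id"] for i in due_for_followup(pended, now=t0 + timedelta(days=14))])  # [1]
```

If the external response arrives early, the item is simply still available in the system and can be reactivated immediately.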

Split and Link

In complex systems, separate work items may be created that relate to the same item of work, or, conversely, one work item may include elements that are really two separate items of work.  Any large system needs to provide users with tools to combine individual work items into one work item or split one or more work items from an existing work item.  This can happen because a trailing document comes in and triggers creation of a new work item rather than being rendezvoused properly, or a document is mis-indexed and included in a work item with other documents.

Few BPM tools include this type of functionality, and it usually takes developers who are new to BPM workflow some time to realize the need.  Then they need to custom build something.  Trex, built with a full understanding of complex high volume systems, comes with functionality to both split and link work items that can be configured to be available from any screen.
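In essence, link merges the attachments of several work items into one, while split carves selected documents out into a new work item.  A toy sketch of those two operations; the id scheme and the `docs` field are invented for illustration:

```python
def link(items):
    """Merge several work items about the same item of work into one."""
    merged = {"id": items[0]["id"], "docs": []}
    for item in items:
        merged["docs"].extend(item["docs"])
    return merged

def split(item, doc_ids):
    """Carve selected documents out of a work item into a new one.

    The new item's id is derived arbitrarily here; a real system would
    allocate a proper identifier.
    """
    new = {"id": item["id"] * 100,
           "docs": [d for d in item["docs"] if d in doc_ids]}
    item["docs"] = [d for d in item["docs"] if d not in doc_ids]
    return item, new

# A trailing document arrived as its own work item; link it back in.
a = {"id": 1, "docs": ["form", "letter"]}
b = {"id": 2, "docs": ["trailing_doc"]}
print(link([a, b])["docs"])  # ['form', 'letter', 'trailing_doc']
```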

Work Search

The ability to search for work items is critical to any high volume system. In a production system containing thousands of work items, finding a single work item by simply browsing queues is impossible and users should not have to contact the IT department to perform work item searches. Trex allows users to perform searches for current and archived work items using pre-configured search templates that make searches more efficient. Search results can be sorted and filtered by users in order to fine-tune results.

External Data Lookup

Large scale, production workflow systems always work in concert with other systems.  Typically the system of record will be a preexisting data system.  While some data may be duplicated in the BPM system for efficiency, it does not make sense to duplicate any data that is volatile.  Thus, there is a need to configure data lookups to support functions such as drop-down lists populated by external reference data, automated lookup of related data based on a single account number or SSN, and field verification against a list of known valid data values.

Trex includes all of these capabilities through configuration as long as data sources are available.  If not, code can be built to expose the data to Trex, which can then be configured to use it.
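Configuration-driven lookups like these can be modeled as named data sources behind a common interface, so that drop-downs, related-data fetches, and field validation all resolve through configuration.  A sketch with invented source names and in-memory data standing in for external systems:

```python
# Hypothetical lookup configuration; in a real deployment each source
# would call out to an external reference system, not an in-memory dict.
LOOKUPS = {
    "branch_codes": {"source": lambda: ["NY01", "NY02", "CA01"]},
    "account": {"source": lambda key:
                {"1001": {"name": "A. Smith", "tier": "gold"}}.get(key)},
}

def dropdown_values(name):
    """Populate a drop-down list from external reference data."""
    return LOOKUPS[name]["source"]()

def lookup(name, key):
    """Fetch related data for a single key, e.g. an account number."""
    return LOOKUPS[name]["source"](key)

def validate(name, value):
    """Verify a field against the list of known valid values."""
    return value in dropdown_values(name)

print(validate("branch_codes", "NY02"))   # True
print(lookup("account", "1001")["tier"])  # gold
```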

Extensible Button Architecture

Customers who need to implement high volume, production systems always need some level of customization to make a system meet their needs.  While efficiency dictates that you want to minimize this customization, large organizations have specialized needs.  To provide the benefits of a product, but provide the flexibility needed for real world applications, Trex is designed to be extensible.  This means that custom code can be written and configured to run based on a button on any of the Trex user interface screens.  This custom code can be added without touching any of the existing code so that errors are not introduced.

For complex functions, Trex classes can be called directly by the custom code to perform various server side functions.  Complete custom subsystems can be integrated into Trex while maintaining the core code.  This makes rolling out new releases much easier than if the base code were modified.
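This extension pattern resembles a plugin registry: custom handlers bind to named buttons without modifying the base code, and unconfigured buttons fall through to default behavior.  An illustrative sketch of the pattern, not Trex's actual API:

```python
# Registry of custom button handlers, populated by a decorator so that
# extensions never touch the base code.
BUTTON_HANDLERS = {}

def button(name):
    """Register custom code to run when a named screen button is pressed."""
    def register(fn):
        BUTTON_HANDLERS[name] = fn
        return fn
    return register

@button("escalate")
def escalate(work_item):
    # Hypothetical custom extension: route the item to a supervisor queue.
    work_item["queue"] = "Supervisor Review"
    return work_item

def press(name, work_item):
    # Buttons with no custom handler leave the work item untouched.
    handler = BUTTON_HANDLERS.get(name, lambda wi: wi)
    return handler(work_item)

print(press("escalate", {"id": 7, "queue": "Intake"})["queue"])  # Supervisor Review
```

Because handlers are looked up by name, a new release of the base product can ship without merging custom code into it.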

Modifiers

Streamlining the work process by small amounts in enterprise systems can save organizations large amounts of time and money due to the high volume of work items being processed. In order to save small amounts of time during each processing step, Trex can be configured to display only the required information for that step which reduces the time processors spend on each individual step. Through the use of modifiers, fields and buttons can be displayed, hidden, enabled, disabled, required, or not required, based on information input on the work item processing interface. Date fields can also be configured to automatically populate with the current date and time.
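Modifiers amount to rules that map the current field values to user interface adjustments.  A minimal sketch, with an invented `decision`/`denial_reason` field pair; Trex's actual rule format is not public:

```python
# Hypothetical modifier rules: each rule inspects current field values and
# returns UI adjustments (visible/hidden, required/optional, and so on).
MODIFIERS = [
    lambda v: {"denial_reason": {"visible": True, "required": True}}
              if v.get("decision") == "deny" else
              {"denial_reason": {"visible": False, "required": False}},
]

def apply_modifiers(values):
    """Compute the UI state for the current work item field values."""
    ui = {}
    for rule in MODIFIERS:
        ui.update(rule(values))
    return ui

print(apply_modifiers({"decision": "deny"}))
# {'denial_reason': {'visible': True, 'required': True}}
```

Hiding the reason field for approvals means the processor never pauses on an input that does not apply, which is exactly the per-step streamlining the text describes.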

User Controlled Management of Permissions

Most BPM and workflow systems provide a role-based mechanism to control user authorization and privileges.  But typically these functions are meant to be controlled by security or IT staff.  An efficient high volume, production system must provide first line supervisors and other managers with tools to manage their workforce, that is, the users processing their queues.  They cannot wait hours or days for someone else to change a permission or provide access to a certain queue.

While it is designed to use LDAP for authentication and to default membership in certain configured roles to LDAP groups, Trex provides a far more detailed and fine-tuned system of roles and user permissions that gives managers and front line supervisors the tools they need to manage their staff and their queues in a real time environment.  They can adjust priorities, redirect users to different queues, and even create sub roles to manage their work effectively.

Work Item Field Security

In large enterprise implementations, some processors are not trained to handle all aspects of the work associated with a work item. To prevent errors in processing, Trex provides security based on work item field values when filtering by queues alone is not sufficient for the desired work distribution model. This allows for the most flexibility in an organization’s security model and ensures that only people with appropriate credentials can modify the fields of a work item.

Integrated Queue Management Console

To make adequate use of their access to role-based permissions for users, managers and supervisors must have access to sufficient data to understand not only the current state of the workload but the trends and projections for the work.  Trex includes a built-in console to give users access to detailed queue counts, work trends, user assignments, and projected time to complete work.  This information is critical for supervisors and other managers to determine how to deploy the people they have available to meet their priorities.

These and other specialized features make Trex ideally suited as a key component in high volume, production BPM solutions.  Projects that start only with out-of-the-box vendor tools will either be faced with significant customization or, more likely, fail to provide the capabilities needed to develop a highly efficient solution.  Many of these features will only be understood after trial and error with successive versions of the solution.

 

Acknowledgements
This blog post is an excerpt from Peter Gretz’s white paper, “Effectively Applying BPM to High Volume, Human-Centric, Production Operations”.

Business Process Management Evolution, Part 2: BPM as a Solution
https://blogs.perficient.com/2016/11/07/business-process-management-evolution-part-2-bpm-as-a-solution/
Mon, 07 Nov 2016 16:00:35 +0000

In this blog series, we examine BPM from multiple perspectives; you can find Part 1, BPM as a Tool, here. Part 2 of our series analyzes BPM as it relates to business processes and defines the elements of a good production BPM solution.

BPM as a Solution

Instead of viewing BPM as a specific tool that is implemented across the enterprise, an alternative view looks at BPM as an approach that provides a solution to inefficiencies in a specific or related set of business processes.  BPM is not a tool that you buy and install.  Rather, it is a solution that you implement, combining various automated tools and process design techniques, to create streamlined work processes.  A BPM solution meets specific requirements for a related set of business processes in an organization.

Production BPM Solutions

One key market that Perficient believes is underserved by today's BPM tools, with their focus on ad-hoc, collaborative, low volume workflows, is high volume, human-centric, document-based, production BPM systems.  These are the types of applications that were often the mainstay of early workflow and BPM efforts and now seem to be considered boring by some industry analysts.  However, they still represent a large opportunity for improving efficiency, removing cost, and increasing customer responsiveness in the core functions of many businesses.

They are typically systems that support large back office operations engaged in functions like claims processing and underwriting, loan operations, account opening and maintenance, mortgage origination, credit card dispute processing, accounts payable, and others.  These are areas of a business that involve many people performing standardized, repetitive processes.  Small improvements in an individual process are magnified by the fact that many people do the same functions repeatedly each day.

Typical high volume, production systems will have hundreds or thousands of full-time users.  Tens to hundreds of thousands of work items may be in the system, and work queues may contain thousands of work items each.  Processors are generally fed work, and supervisors manage teams of users and selected queues.  Often these applications are human-centric because they involve a fair amount of externally generated content that is not cost effective to convert to structured data.  Elements of the work may also include interaction with customers via correspondence and telephone.

These functions were a likely target for early workflow systems that often started as adjuncts to imaging and document management systems.  Most of the automated functions were supported by mainframe systems that stored all the structured data that related to the cases or accounts, but the business operations were mired in paper documents that were involved in most aspects of the processes.

Significant increases in efficiency could be gained just by converting the documents into an electronic format and storing them on-line.  This allowed multiple users to access documents at the same time, sped up the movement of documents from one location to another, and eradicated document loss.  It also allowed the creation of workflow systems in which the process was driven by the receipt of the documents.

However, in some cases the promise of this technology did not move much beyond a system that did an initial sort of the documents and assigned them for work.  Users and IT staff were able to make the shift from paper to computer-based image documents but they could not envision how the availability of these documents on-line significantly expanded how work processes could be organized.  This required melding new technology with business processes and an understanding of business people.

Thus, while these types of applications and business processes were some of the first targets of early BPM tools, the resulting systems sometimes only scratched the surface of the efficiency gains available.  There are still a large number of operations that are ripe for the application of high volume, production BPM solutions.  Simple workflow systems, while easier and cheaper to implement, do not exploit the full value that the business can realize by carefully redesigning their processes and using the full capabilities of different automated technologies that make up a production BPM solution.

The size of these solutions and the potential savings from a large, well-implemented, production BPM solution means that it is cost-effective to design and customize unique features, specialized tools, and system integrations to maximize the efficiency of a particular application.  When there are thousands of users working several work items each per hour, an incremental investment in a feature that saves a few seconds per work item can still provide a significant payoff.  This is a key concept that must be appreciated to implement highly efficient production BPM systems.

Elements of a Production BPM Solution

A good BPM solution must bring together multiple components:

  • Process Reengineering Methodology – The key element is an approach to refining and improving processes in which the solution designer understands and exploits technology, but goes beyond the technology to frame the solution from a business perspective.
  • A high volume BPM engine – This provides the key abilities to design and execute business process maps that embody rules for the movement of work through an organization. It must be able to handle thousands of users processing hundreds of thousands of work items.
  • An ECM repository – This supports efficient access to unstructured data, usually in the form of documents, that are a key part of most production business processes.
  • A user application layer – This provides the user interface that allows human participants in the business process to interact with the work items and related content at each step as efficiently and effectively as possible.
  • Application integration tools – These allow efficient, bi-directional communication with other systems involved in the business process and allows the application layer to provide a common front end.
  • Reporting and analysis tools – A complex business process requires management both in the short term and the long term. Short term management focuses on meeting daily work volume goals, quality goals, and timeliness goals.  Longer term management looks at continuously improving the process to further increase efficiency and quality.
  • Industry- and solution-specific expertise and best practices – Experience and best practices for BPM implementation in general, as well as for the specific business areas being automated, are crucial to avoid the many pitfalls that inexperienced BPM teams tend to suffer.

A high volume, production BPM solution needs all of these elements if it is to exploit the full value that process redesign can bring.

Acknowledgements

This blog post is an excerpt from Peter Gretz’s white paper, “Effectively Applying BPM to High Volume, Human-Centric, Production Operations.”

 

Business Process Management Evolution, Part 1: BPM as a Tool
https://blogs.perficient.com/2016/10/31/business-process-management-evolution-part-1-bpm-as-a-tool/
Mon, 31 Oct 2016 12:04:16 +0000

Over the last 30 years, what is now called Business Process Management (BPM) has evolved from simple workflow systems to a host of products and methodologies. As with any popular business concept, BPM has come to mean many different things to many different people.  At its crux, BPM is a discipline to apply systematic process redesign and employ automated tools to achieve improved productivity, effectiveness, and management of specific, repeatable business processes.

A common area of misunderstanding comes when people begin to think of BPM as a technology or a system. BPM is a discipline that combines elements of technology but ultimately relies on bringing technical tools and knowledge together with a detailed understanding of business needs and goals to develop better methods of performing specific business functions.  One of the keys is that BPM cannot be successful without a combined effort of IT and the business. That being said, there are many vendor products that provide useful tools to help implement successful BPM solutions.

In this blog series, we will examine BPM from multiple perspectives. Part 1 looks at BPM from a tooling perspective to analyze the dimensions of a good BPM tool, trends in implementation, and out-of-the-box products.

BPM as a Tool

BPM tools provide a starting point necessary for creating an effective BPM solution. But no out-of-the-box vendor tool can, by itself, handle all of the potential business processes effectively and efficiently.  Business processes are just too varied to be covered by a one-size-fits-all solution that can easily be configured by business analysts. BPM requirements are as varied as businesses and all the disparate processes that comprise a business's operations.

At one extreme are processes that are appropriate for highly automated straight through processing (STP) with minimal human interaction.  Such a process relies heavily on close integration between existing systems, high-speed transaction processing, and simple user interfaces, if any, for limited exception processing. This type of business process would be best implemented using a product designed to orchestrate services or provide high-speed message processing.

At another extreme is a highly variable, knowledge worker process that may include ad hoc routing, team collaboration, and smaller numbers of long-lived work items. This process might be implemented using just email and standard office tools or with a BPM case processing tool.

A third situation might include a highly routine, high volume, but human-centric set of processes.  This situation calls for a sophisticated user interface that brings all the information together for the user and allows them to perform their routine tasks as efficiently as possible. The routine and high volume characteristics of the transactions allow for a greater return on money spent customizing the application to the unique needs of the process.

While they are all BPM solutions as the term is commonly used today, they require different tools, techniques, and levels of customization to implement efficient solutions. Some tools may provide adequate support for one or more of these scenarios, but no off-the-shelf tool will provide a highly effective, complete solution for all of them without customization.  This is one of the reasons that BPM has become fairly confusing – it just does not make sense to look at it as a monolithic market, especially when looking at the automated tools that support its implementation.

Dimensions of a BPM System

Understanding the full depth of BPM as a discipline requires an understanding of the different dimensions that can define a BPM solution. Numerous dimensions exist that can be used to segment the growing BPM solution space.  Some of the most important include:

  • Level of human versus system processing – In a system that defines BPM as the orchestration of a set of services, the focus will be on simple and highly efficient queuing and routing as well as translation of data among systems. In a system that involves significant human processing, more focus will be on providing efficient user interfaces and work management tools.
  • Long running versus short running workflows – In a system that defines BPM as the orchestration of a set of services, total workflow time may be seconds or less. In a government case processing application, the total time for a case may be years.
  • Attached content versus data only – Workflows that include content are often driven by the receipt and updating of the content. They need tools to allow users to efficiently access and act on this content as an integral part of the process flow. Data-only workflows do not need these types of capabilities.
  • Deterministic versus ad-hoc routing – Knowledge worker systems tend to be more user-directed, at the simplest level becoming like an email system. Other systems derive significant value from having pre-defined routing rules that enforce business standards and consistency in the process.
  • Relatively stable workflows versus frequent workflow changes – Some systems are built with the knowledge that routing paths may change frequently and unpredictably, while others incorporate fairly static routes although rules for following the routes may change. While any system must be built to be flexible to changing business priorities, change is expensive and should only be introduced when necessary. Thus, different design and implementation approaches should be taken depending on the projected need to frequently revise the basic workflows once they have been designed.
  • Full-time users versus occasional use – Users in a back office operation will tend to use the system full-time and be very familiar with it. They will want it designed for maximum efficiency. An occasional user, such as someone submitting travel expenses, might use the workflow only once a week. They need a system that sacrifices some efficiency for ease of use.
  • Large systems versus small systems – Systems with lots of users and volume can justify spending more to customize the workflow and user interfaces. The cost of the extra work is amortized over a much larger group of users.
  • Level of empowerment of users – Does the workflow include rules to drive the assignment of work or do users select from lists? Are tools needed for supervisors to manage the first line users?

This wide variation in BPM scenario characteristics poses a dilemma for standardizing on a single out of the box BPM tool for all your BPM needs.  No single product by itself can do an excellent job in all of these situations.  And these tools must be supplemented with design techniques, subject matter expertise, and traditional system development approaches to handle all of the variations.

The Quest for Painless Implementation

Another trend driving many off-the-shelf BPM tools is the push for simple, fool-proof implementation. Some early BPM tools got a bad reputation because implementation required lots of programming to produce even a simple system. By its nature, human-centric BPM brings together the information from many different systems to provide a single automated desktop to the user. This can require complex integration to disparate systems. In addition, BPM tools that concentrate on the backend routing still require significant work to create the front end that users need to interact with the work flowing through the BPM processes.

An industry goal has been to build a tool that can be used to implement BPM solutions without programmers. Industry analysts have encouraged this goal as a key element for any industry leading BPM tool.  However, this approach tends to favor relatively simple BPM solutions.  Otherwise, the complexity of the configuration becomes tantamount to programming.  Likewise, the approach tends to obscure the fact that for any reasonably complex system, one still needs to practice the system implementation discipline and processes needed to build any system.  You cannot skip design, testing, and release control and expect to produce a complex system that works.

So while these BPM tools may make the actual “programming” simpler through custom configuration tools and utilities, most of the work required to implement and maintain a complex solution remains.  As systems become more complex, configuring these tools can be as difficult as some programming, but the tools require specialized training and often do not have the debugging and other utilities that have made programming more efficient over the years.  In addition, in their quest for simplicity, some tools limit the ability to customize the system to meet the needed requirements.  These limitations are usually not apparent until you try to implement a specific requirement.

The Generic Out-of-the-Box BPM tool

The goal of the typical BPM product today is to provide a tool that allows a company to implement BPM solutions quickly and efficiently with little to no custom programming.  To be successful in these goals, BPM product vendors pick specific segments or dimensions of the BPM market to which they orient their product.  With sufficient customization, any tool can be used as a basis for any class of BPM solution.  But, out of the box, different tools tend to be best suited for specific solution areas as characterized by the dimensions of a BPM solution described above.

Over the last few years, these trends in the market have tended to move many BPM products toward ad hoc, lower-volume BPM applications. These applications are characterized by individual in-baskets, more ad hoc routing, simple browse-style interfaces for users to select work, and lower volumes of work. The typical user can easily browse their current list of assigned work due to the limited number of items. User interfaces are fairly simple, and interactions with in-house custom systems are limited.

This type of BPM system also fits many people’s initial concept of what an automated BPM system will look like: a system that gives everyone a personal electronic in-basket, a means to view and act upon work items in their in-basket, and the capability to route work items to other users. This is a system well suited to knowledge workers, where less emphasis is placed on standardization of work and more emphasis is placed on flexibility. An emerging class of case processing tools fits into this segment of the market.
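To make the in-basket model concrete, here is a minimal sketch (not any particular vendor’s API) of the three capabilities just described: a personal in-basket per user, a way to browse assigned work, and ad hoc routing of work items between users. All class and method names here are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class WorkItem:
    """A unit of work that moves between users' in-baskets."""
    item_id: str
    description: str

@dataclass
class InBasket:
    """A personal electronic in-basket holding one user's assigned work."""
    owner: str
    items: List[WorkItem] = field(default_factory=list)

class CaseRouter:
    """Ad hoc routing of work items between in-baskets."""

    def __init__(self, users: List[str]) -> None:
        self.baskets: Dict[str, InBasket] = {u: InBasket(u) for u in users}

    def assign(self, user: str, item: WorkItem) -> None:
        """Place a new work item in a user's in-basket."""
        self.baskets[user].items.append(item)

    def route(self, item_id: str, from_user: str, to_user: str) -> None:
        """Move a work item from one user's in-basket to another's."""
        src = self.baskets[from_user]
        item = next(i for i in src.items if i.item_id == item_id)
        src.items.remove(item)
        self.baskets[to_user].items.append(item)

    def browse(self, user: str) -> List[str]:
        """Simple browse-style view of a user's current work list."""
        return [i.description for i in self.baskets[user].items]
```

With a small number of items per user, the `browse` view is all the work-selection interface a knowledge worker needs, which is exactly why this model suits lower-volume, flexibility-oriented operations.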

These tools also highlight the ability to rapidly change routing paths and rules to quickly modify the workflow. While there is a class of BPM solutions where this flexibility makes sense, the fact that changes can be made quickly and easily does not mean an organization should make them frequently. If the process involves a significant number of users, any change will be disruptive to the organization. Changes that are not carefully thought out before implementation may add cost to the operation rather than increase efficiency. Likewise, changes that are not thoroughly tested before being introduced to a production environment will likely disrupt work and cause unnecessary downtime. In BPM, where users often cannot work at all if the system is down, this can be very expensive.
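The reason rule changes are both easy and risky is that routing logic in such tools is effectively data, not code. The sketch below (a hypothetical rule format, not any vendor’s) shows routing rules as an ordered table, plus the kind of minimal regression check the paragraph above argues for before promoting a rule change to production.

```python
# Routing rules as data: each rule pairs a condition on a work item's
# attributes with a destination queue. Modifying the workflow is just
# editing this table -- which is exactly why untested edits are risky.
ROUTING_RULES = [
    {"when": lambda item: item["amount"] > 10000, "queue": "senior_review"},
    {"when": lambda item: item["type"] == "complaint", "queue": "escalations"},
    {"when": lambda item: True, "queue": "general"},  # catch-all default
]

def route_item(item: dict) -> str:
    """Return the queue of the first rule that matches the item."""
    for rule in ROUTING_RULES:
        if rule["when"](item):
            return rule["queue"]
    raise ValueError("no rule matched")  # unreachable with a catch-all

def smoke_test_rules() -> None:
    """Minimal regression check to run before deploying rule changes,
    so a bad edit cannot silently misroute live work."""
    assert route_item({"amount": 50000, "type": "claim"}) == "senior_review"
    assert route_item({"amount": 100, "type": "complaint"}) == "escalations"
    assert route_item({"amount": 100, "type": "claim"}) == "general"
```

Because the rules are ordered, even a small edit (say, reordering two entries) can change where work lands, which is why the testing and release discipline discussed above still applies no matter how simple the editing tool makes the change.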

Acknowledgements

This blog post is an excerpt from Peter Gretz’s white paper, “Effectively Applying BPM to High Volume, Human-Centric, Production Operations.”

Check out part two of the series, BPM as a solution.
