Perficient Blogs https://blogs.perficient.com Expert Insights Wed, 26 Jun 2019 17:17:24 +0000

Oracle Simplifies Branding and Packaging of its Analytics Product https://blogs.perficient.com/2019/06/26/oracle-simplifies-branding-and-packaging-of-its-analytics-product/ Wed, 26 Jun 2019 14:49:06 +0000

Oracle has simplified its cloud, on-premises, and prebuilt analytics applications under a single brand, ‘Oracle Analytics’. At its analytics summit this week, Oracle announced a new customer-centric vision for its analytics products. Under the new product direction, more than 18 products, including Oracle Analytics Cloud, on-premises platforms such as OBIEE, and Oracle BI Applications, will be consolidated under the single ‘Oracle Analytics’ brand.

There are essentially only three products for customers to choose from:

  1. Oracle Analytics Cloud – For customers considering a cloud analytics platform and looking to leverage AI-powered, self-service analytics functionality for data preparation, visualization, enterprise reporting, augmented analysis, and natural language processing/generation, Oracle Analytics Cloud is the platform to choose. Autonomous Data Warehouse, along with the Oracle Data Integration platform, allows customers to build a standalone analytics platform as well as a pop-up data mart for business users.
  2. Oracle Analytics Server – Oracle recognizes that several on-premises Oracle Business Intelligence customers on OBIEE are not considering migrating to the cloud but would still like to leverage augmented analytics capabilities. Those customers will now be able to take advantage of augmented analytics and world-class data discovery capabilities as part of Oracle Analytics Server, with no additional cost for the upgrade.
  3. Oracle Analytics for Applications – Oracle BI Applications, the prebuilt analytics offering for Oracle ERP customers, was adopted by a large number of those customers and was a very successful analytics product for Oracle. Building on that success, as part of Oracle Analytics for Applications, Oracle will roll out personalized business application analytics, including benchmarks and machine-learning-driven predictive insights. All line-of-business users gain real-time access to prebuilt, auto-generated business content and insights, starting with solutions for Fusion ERP and then expanding to Fusion HCM, SCM, CX, and NetSuite.

Also, with the new pricing strategy, price will no longer be a barrier for customers considering the Oracle Analytics platform. Customers will have the option to pay by user (Professional) or by OCPU (Enterprise), and both prices are competitive with low-cost as well as leading analytics vendors. I believe the new strategy will allow Oracle not only to retain its existing OBIEE and Oracle BI Apps customer base but also to take some share from other leading vendors. The simpler product range will make it easier for customers to choose and will likely lead to more adoption and sales for Oracle Analytics.

3 things I want to explore more on Groovy with EPBCS API – KSCOPE19 https://blogs.perficient.com/2019/06/25/3-things-i-want-explore-more-about-groovy-epbcs-api-kscope19/ Tue, 25 Jun 2019 21:15:19 +0000

Kscope19, ODTUG’s user conference known for providing rich opportunities for networking and education on Oracle technologies, is taking place this week in Seattle!

I was fortunate enough to attend the session “Getting Started with Groovy for the Non-Technical Superstars” with presenter Kyle Goodfriend, Vice President of Oracle Practice at Accelytics, who introduced how to get started with Groovy.

The EPBCS API has many functions that can be used to perform tasks within Planning. Groovy is the language used to interact with those functions and to execute them.

Some helpful things that I learned from attending this KSCOPE19 session about Groovy are:

  • Groovy scripting can change the way form validations are done. We can get real-time warnings focused only on the yellow edited cells on the form by making Groovy operate on just that set of data. Focusing validation on the edited cells improves performance significantly and lets you incorporate many validations directly in the form, instead of keeping them out of the form for performance reasons.
  • I learned that grid iteration helps pass only the edited cells to data maps or Essbase calculation scripts and validate the input data based on them, resulting in better-performing data validation.
  • You can use Groovy scripting to add more information to the default log messages that the system generates on the JOBS screen after executing a job or a rule.
  • It is also possible to sync data from one data input form to multiple BSO or ASO databases within the same application. This can cut out extra processing and reduce the number of calculation scripts or jobs needed to sync multiple apps in one go.
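The grid-iteration and data-sync ideas above can be sketched as a short Groovy business rule. Treat this as an illustrative sketch only: it runs solely inside the EPBCS calculation engine (the `operation` object is injected by the platform, not importable), API signatures vary by release, and the dimension name ("Period"), data map name ("SyncToReporting"), and negative-value check are hypothetical placeholders rather than anything shown in the session.

```groovy
/* Sketch only: executes inside an EPBCS Groovy business rule, where the
   platform supplies the 'operation' binding. "Period", "SyncToReporting",
   and the negative-value rule are hypothetical placeholders. */
def editedPeriods = [] as Set

// Visit only the cells the user actually edited on the form
operation.grid.dataCellIterator({ DataCell cell -> cell.edited }).each { cell ->
    // Validate just the edited data instead of the whole form
    if (cell.data < 0) {
        throwVetoException("Negative value ${cell.data} is not allowed")
    }
    editedPeriods << cell.getMemberName("Period")
}

// Push only the edited intersections through the data map,
// rather than syncing the entire cube
if (editedPeriods) {
    operation.application.getDataMap("SyncToReporting")
            .execute(["Period": editedPeriods.join(',')], true)
}
```

Because the iterator is limited to edited cells, both the validation and the downstream push touch only a small slice of data, which is where the performance gains described above come from.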

If you are wondering where to begin, the documentation links below can help. To start writing a Groovy script in EPBCS, choose Groovy Script instead of Calculation Script in the rule editor’s edit mode. Be aware that the editor may not be very helpful at flagging Groovy syntax issues.

 

Here is further documentation on Groovy:

Groovy and SmartPush

Documentation on API

 

Northwell Health Overcomes Self-Service Analytics Challenges https://blogs.perficient.com/2019/06/25/northwell-health-overcomes-self-service-analtyics-challenges/ Tue, 25 Jun 2019 19:15:26 +0000

Perficient is excited to share the on-demand recording of our HIMSS 2019 educational session presentation. The session showcases a comprehensive analytics and data strategy solution developed in partnership with Northwell Health, the 14th largest health system in the U.S. and the largest in New York. The session is now available on-demand below.

View the session recording to learn about the tools, processes, education, and measures for enabling information and discovery analysis across clinical, financial, and operational functions.

Session Overview

Northwell Health was able to leverage participation in New York State’s Medicaid reform program to develop the infrastructure for integrating its clinical, claims and financial data into a single repository.

As a result, Northwell Health was able to enhance its performance measurement and gain data efficiencies around nearly 10 million patients and 65 million interactions, leading to improved accuracy in its reporting and analytics.

During the presentation, speakers Jim Kouba, Health Solutions Director at Perficient, and Chris Hutchins, Associate Vice President of Healthcare Analytics at Northwell Health, discuss:

  • Key techniques to build partnerships and establish a strong team approach between business and information and technology across the organization.
  • Best practices for providers wanting to build self-service analytics programs.
  • The importance of establishing a clinical enterprise data warehouse that provides a single view of truth and enables Medicaid payment reform program participation.


Creating Holistic Digital Experiences for Healthcare Events pt. 2 https://blogs.perficient.com/2019/06/25/creating-holistic-digital-experiences-healthcare-events-pt-2/ Tue, 25 Jun 2019 13:29:00 +0000

Previously, I discussed the benefits of having a centralized place on your organization’s flagship healthcare website to create awareness and promote events digitally, the importance of a holistic end-to-end events digital experience that aligns with the rest of your digital footprint, and an overview of the events tool market. This post identifies the top players to consider and the key criteria to use in an events tool assessment to help you find the right vendor for the events digital experience.

A number of digital tools are available to create holistic digital experiences for healthcare events, offering full-spectrum support for calendars with search and filter capability, registration and payment methods, and event promotion. This second post examines the top players for creating a complete end-to-end digital experience for healthcare events and provides a set of key criteria to help you identify your tool requirements and the right software provider for your needs.

Top Players in the Market

The events management, registration, and marketing tools industry has a small number of software providers that offer a comprehensive end-to-end solution. These top players include Aventri, Bizzabo, Cvent, EventBank, Eventbrite, Eventzilla, and Trumba. They share a common set of features that include event registration and payment, customizable templates with various branding options, e-invitations for event promotion, social media integration, and reporting or analytics tool integration. Where they start to differentiate is in the value-added extras for general event planning (such as agenda builders, name badges, attendee check-in, etc.) as well as the types of solutions offered (e.g., a white-label domain events website, embedded widgets for your flagship website, listing events on the vendor’s community site, an API feed, or mobile apps). Pricing across software providers varies greatly according to the volume of events and attendees, in addition to the full range of features needed for your solution.

Bizzabo offers some of the highest levels of customization, including a white-label domain site with an easy drag-and-drop website builder that has won awards for back-end ease of use. Cvent offers the strongest analytics reporting of the group, built into its platform, along with Google Analytics integration. Eventzilla offers the greatest flexibility, with an API available for creating your own custom solution. Trumba offers solutions that can be integrated directly into your website through an embedded widget, so users never leave your site throughout the experience; it supports a high level of customization to keep the digital experience consistent with the rest of your website and also has exceptional customer service support.

With a number of players in the market offering strong solutions for creating holistic digital experiences, it comes down to finding the right tool that meets your organization’s digital experience needs for healthcare events.

Developing Business Requirements

Start by conducting an events audit on the public-facing website and on the intranet. This way you’ll know how many events are listed in total, what type of events they are (e.g., for a specific clinical or administrative area), what tools are currently used for event listing and registration, and whether you require a payment method. To obtain pricing from software providers, you will need to know how many events you host annually, along with a breakdown of free versus paid attendees.

With these details, you can then move onto answering a couple of key questions to assist you in scoping the solution. Of the departments/clinical or administrative areas that regularly host events, how many require their own dedicated calendar? Is a calendar needed on the intranet in addition to the public facing website? The number of calendars you need will form a key part of your conversation with prospective software providers to ensure the tools meet your very basic requirements.

With this lay-of-the-land understanding of the healthcare events at your organization, it’s now time to start gathering detailed requirements for the events digital experience so you know what the final solution needs to be and can make an informed tool selection. It’s worthwhile to take the time to work through the steps below:

  1. Learn about existing tool capabilities so they can inform requirements
  2. Conduct a series of sessions with power users of the existing tool(s) to learn about the pain points and what is working well from an authoring perspective; this should also inform requirements
  3. Schedule sessions with departments that host a large number of events to understand any future changes they anticipate in their needs
  4. Meet with Marketing and IT to understand what the current integrations are for the tools being used and existing event promotion

With the detailed requirements in place, you can start to review various software providers in order to find one that best meets your needs.

Tool Assessment Criteria

Meeting with software providers to see and learn about the solutions that align with your needs is a key next step. The criteria below will help structure your discussions with tool providers and serve as food for thought after each meeting to help you decide whether a provider is a good fit for your needs.

  • What level of customization of the experience is needed compared to what the software provider offers out of the box? Is there enough branding, opportunity for consistent design styling, or customizable questions to ask regarding medical history?

  • What search and filter functionality is available? What are the restrictions on user functionality?
  • Is there a method for accepting payments? How does payment work? Who is the preferred payment gateway provider?
  • What do the authoring and publishing workflows look like? What’s the ease of use of the interface to set up/edit events, registration forms, promotional emails, etc.?
  • What level of analytics is provided to measure success? Are any integrations available with analytics platforms?
  • Do they offer additional integrations that might be needed with CRM and/or Marketing?
  • What’s the pricing model? Are there restrictions for a number of events, attendees, admin logins, etc.?
  • Is the solution mobile friendly?

Ensure tool demonstrations are part of your dialogue with software providers; they will be key in helping you and your team ask questions and confirm the solutions meet your needs. Demonstrations also help the development and creative teams understand what is possible and estimate the level of work involved. From this level of insight, it will be clear which software provider is the best to proceed with.

If the new solution requires migrating from existing tool(s) to a new centralized tool, the clinical and administrative departments across your organization that host events frequently will need a transition plan for when the switchover will take place. They will also need training and support on the new tool for authoring events.

I hope this post has been informative in helping you understand the key players in the events management, registration, and marketing industry, and that the tool assessment criteria provide useful guidance for making that final decision on tool selection. I would love to hear how you’ve navigated this space and created holistic digital experiences for your healthcare events. Please note, the third and final post will offer guidance on features to include when designing or redesigning the digital experience for healthcare events and how to create a personalized experience for website users.

Driving Behavior Change Leads to Accelerated Value https://blogs.perficient.com/2019/06/25/driving-behavior-change-leads-value/ Tue, 25 Jun 2019 13:12:41 +0000

This blog series examines how change management can and should pay for your next project. My previous post explored why some organizations don’t immediately realize the value of a business transformation. In this post, I’m going to describe how behavior change leads to value.

Effective organizational change management is a complex undertaking, and leaders often fail to appreciate just how difficult it is. They sometimes make the mistake of believing that it is enough to have the right strategy and to communicate it clearly. To be sure, that is a necessary starting point. Employees need to understand and accept the change and its impact on them since everyone in a business transformation, not just the leaders, must have their oars in the water. However, simply understanding the change is not enough.

More challenging is making sure employees have the requisite skills to execute against the new processes and technologies that bring transformation. After all, the essential element of any transformation is changing employee behavior, especially at the point of implementation. This is no easy task, and more often than not, the anticipated benefits don’t immediately materialize.

If anything, those in the change management field often warn leaders to expect performance to decline upon implementation. Why? Because change efforts require employees to do something different – something that is initially harder, that makes them work longer, and that makes them feel less comfortable.

To overcome these obstacles and accelerate value, leaders need to reorient their focus from the strategic, somewhat amorphous, concept of “managing change” to the tactical concept of “changing behavior.” But how do they do this?

For more insight on how OCM can (and should) pay for your next project, download our guide here or below.

How to Build a Winning Data Platform https://blogs.perficient.com/2019/06/25/build-a-data-platform/ Tue, 25 Jun 2019 11:00:02 +0000

Recently, at Informatica World 2019, I heard about the importance of a data platform in building AI capabilities for an organization. What is interesting is that Informatica, known for products that deliver the “Switzerland of Data,” is now using AI to enhance its own suite of products with CLAIRE capabilities. In exploring a few other articles on the importance of data, I also came across Monica Rogati’s Data Science Hierarchy of Needs and was impressed by the way she relates the AI structure to Maslow’s Hierarchy of Needs.

In a way, the “self-actualization” that Maslow defines as “achieving one’s full potential” is the AI capability. To get there, however, you need the basics of a data platform foundation. An important distinction between Monica Rogati’s Data Science Hierarchy and my pyramid structure is the assumption that you would use capabilities from software products such as Informatica, which offer GUI-based tooling so you can spend more time on governance, analysis, and quality and less time writing custom code. Please keep that in mind as you read this article.

Data Platform Model

Data Platform Path

FIND
It’s paramount to identify and clearly define the “use case” the AI team is going after. Without a meaningful use case, building machine learning and automation just for the sake of exploration doesn’t provide any value. Once the use case is defined, find where the data resides, inside or outside the enterprise (benchmarks, third-party data, etc.).

COLLECT
With the commercial and open-source tools available in the data marketplace, you can quickly build data integration to collect real-time or batch data into a data lake. Don’t overthink the quality of the data at this point.

UNDERSTAND
Once you collect data into a data lake, understand it by profiling the datasets and mapping them back to your use case. You can also define tags in your data to put your datasets in a business context. In addition, take the effort to classify the data you collected into categories that make meaningful business sense.

INTEGRATE & TRANSFORM
Once you tag and classify your datasets, integrate data from multiple sources into one data model that can support your defined use cases. In some cases, this can also mean enhancing your existing data model to support multiple use cases.

ENRICH
Integration should also include data enrichment. Many open datasets, such as weather, traffic patterns, currency, disaster, and health conditions, are available for the public to consume. In addition, third-party datasets such as Dun & Bradstreet can help validate customer addresses.

SCALE
It’s clear that to integrate such large, disparate datasets and build data models from them, your cloud or on-premises data platform must be able to perform at scale. Use performance tuning and storage/compute techniques that will deliver on-time results.

EXPERIENCE
Good-quality data doesn’t mean anything without showing results in a format that can be consumed by different audience levels (line level to executives). Reporting platforms such as Power BI, Tableau, and MicroStrategy have been market leaders for a reason, with their ability to build beautiful visualizations on streaming or large batch datasets; it’s no surprise that large cloud vendors such as Salesforce have been acquiring BI companies like Tableau to enhance their visualization capabilities.

Defining Metrics

One other important factor is to define the metrics and measures clearly to take actions based on facts.

MONITOR
Building the data platform is not a one-time activity. Data, like infrastructure, needs continuous monitoring and improvement based on feedback from the business subject matter experts (SMEs) who also act as data SMEs. Therefore, as you build your data platform, use monitoring services and build notifications and alerts based on thresholds driven by business needs. Additionally, you can rate your data based on the relevance of the datasets to your decision-making process. This will improve the quality of the data that matters most to the organization, and it will also help prioritize critical datasets over others, similar to putting tighter SLAs on important systems and their recovery procedures.

AI & DEEP LEARN
All the steps above lead to building the machine learning algorithms and automation processes that will surface relevant opportunities and have a direct impact on your organization’s bottom line.

While the sequence above will manage your data throughout the data preparation lifecycle, data security and data governance play a key role in managing the data lifecycle as well. In addition, DevOps will provide the agility to keep building the data platform as the business moves and changes while mergers and acquisitions dominate the current landscape.

4 Strategies to Make CX a Competitive Differentiator https://blogs.perficient.com/2019/06/24/4-strategies-to-make-cx-a-competitive-differentiator/ Mon, 24 Jun 2019 20:03:35 +0000

I have two pieces of old news to report. First, the world is changing faster than ever. Second, customers are more empowered than ever and have almost complete control over their own journey. So, what is the real news today? These two well-established trends have joined forces to push companies to think way beyond the already challenging task of keeping up a great product or service. The opportunity is in the complete customer experience, and the bonus goes to those who can anticipate their customers’ unmet needs and expectations. As they say, “what got you here won’t keep you here.”

Still, I’m seeing some resistance and even some denial from within some of our clients. Customer-centricity can challenge the internal operational mindset of a company. Sometimes the customer experience is thought of as something the contact center has to deal with down the line. Other times, the marketing organization is in the best position to advocate for the customer, but the rest of the organization sees them as simply brand and promotion. Rallying the organization around customer experience (CX) can be challenging. We share a few of the rally cries that are working.

4 Ways to Excel at CX:

1. Have Empathy for Your Customers

Establishing customer empathy means deeply understanding who your customer is, how they think, what their needs are, and how what you do impacts them. Customer research is a tried-and-true tool for empathy building, but too often it’s only used when launching a new product or a transformational strategy. Your company may have big transformational ambitions, but it shouldn’t forget to address the “now.” It’s more expensive to gain a new customer than to retain an existing one, so balancing the now, the new, and the next comes from the insights of ongoing empathy building.

2. Cater to the Entire Customer Journey

A client of ours, a leading athletic wear company, recently ranked No. 1 in Total Retail’s list of top omni-channel retailers. As I wrote in a previous blog post, the company “created a unified experience that allows customers to engage with the brand however, whenever, and wherever they want.” Truly understanding and improving the customer journey requires more than empathy. Today’s journeys are nonlinear and can start and end anywhere. That’s why it’s important to consider the entire journey across all channels. Being prepared to show how customer success requires coordination from different departments is a good foundational move.

3. Strike the Right Balance Behind the Scenes

Anyone can make shiny objects. However, your great ideas need to actually work and to actually drive business. It’s not just about what’s next; it’s also making sure what’s now or new to you gets attention, because customers’ expectations are continuously evolving and constantly raising the bar. Don’t ignore one end of the innovation spectrum at the expense of the other by, say, overinvesting in technology and underinvesting in operations.

4. Be Adaptable to Change

Building strategy is an attempt to predict the future and make smart choices with scarce resources, funding, and time. But the reality is that you have to be prepared to make ongoing, risk-tolerable decisions and be ready to react to changes. And because customer expectations (and the world) are changing all the time, the cornerstone of a strategy is not prediction but agility and change.


Want More Digital Transformation Advice?

The digital transformation strategies I share in this blog post draw from Perficient’s e-book, “How to Make Digital Transformation Gains in 2019.” In it, my fellow Perficient Chief Strategists and I share real-world examples from conversations with today’s leading brands at various stages of digital transformation. Our 10-chapter e-book features our business insights, actions to take now, and client success stories. Download it here or via the form below.

Next in the Series

This blog series is part of a special series inspired by our e-book. In the next post, Perficient Chief Strategist Scott Albahary will share tips for pinpointing your value proposition.

Subscribe to our Digital Transformation weekly digest here to get the blog posts automatically delivered to your inbox every week. Or, follow our Digital Transformation blog for this series and advice on the topic from all of our thought leaders.


About the Author

Jim Hertzfeld leads the Strategy and Innovation team for Perficient Digital, providing customer experience insights, ideation, and investment strategies for Perficient’s digital solutions. Jim co-founded the Digital Strategy Group for Meritage Technologies in 2000, which was acquired by Perficient in 2004. He also authored Perficient’s Envision strategy methodology in 2005, which has resulted in a number of client engagements and established new client relationships focused on digital strategy and customer experience.

Taking Advantage of Streamlined Sales Tax Agreement https://blogs.perficient.com/2019/06/24/taking-advantage-of-streamlined-sales-tax-agreement/ Mon, 24 Jun 2019 17:56:00 +0000

Editor’s Note: This guest blog post comes courtesy of Jeff Stanton with Avalara.

The Streamlined Sales and Use Tax (SST) Agreement, launched in 1999, now has 24 member states, though many businesses are still not clear on how it works or how it could improve their sales tax compliance efficiency. Companies registered and collecting sales tax in multiple states will likely benefit from exploring SST. Any business that qualifies as a volunteer seller in SST states can use an SST Certified Service Provider (CSP) to register, calculate sales tax, and file returns at no cost. Now for the best part: Avalara is an SST Certified Service Provider!

Tell me more – what is SST?

SST resulted from 44 states, DC, and local business communities coming together to simplify the collection and reporting of sales tax on remote sales. This happened after the Supreme Court of the United States ruled twice, in National Bellas Hess v. Illinois (1967) and Quill Corp. v. North Dakota (1992), that state and local sales tax compliance was too complicated to inflict on businesses based out of state with no physical presence in the state.

The resulting Streamlined Sales and Use Tax Agreement makes sales tax administration in SST states less costly and burdensome for businesses that transact in member states.

Because the approval process is rigorous, not all of the original 44 states that conceived of SST are currently full members. The following are full member states: Arkansas, Georgia, Indiana, Iowa, Kansas, Kentucky, Michigan, Minnesota, Nebraska, Nevada, New Jersey, North Carolina, North Dakota, Ohio, Oklahoma, Rhode Island, South Dakota, Utah, Vermont, Washington, West Virginia, Wisconsin, and Wyoming. Tennessee is an associate member state. This list is expected to grow in the coming years.

Streamlined Sales Tax (SST). Why Now?

It’s been one year since the Supreme Court overruled Quill’s physical presence rule in South Dakota v. Wayfair, Inc. The resulting change to the sales tax landscape threw many companies for a loop. Moving forward, states have the authority to require sellers with no physical presence in a state to collect and remit sales tax based on their economic activity in that state, otherwise known as economic nexus. Previously, having a physical presence in a state was the only criterion that triggered a sales tax collection obligation (physical nexus).

As of June 2019, 37 states — including all but one of the SST member states — have adopted economic nexus laws; each already requires, or soon will require, out-of-state sellers who exceed defined thresholds on the number of transactions or total amount of goods sold to collect sales tax. Remote sellers who do business across numerous states have had to reassess not only where they have nexus, but also how they manage their tax compliance.

See the growing list of states that impose a sales tax collection obligation on remote sellers.

Registering for SST – which can be done through the SST website –  makes sales tax compliance in participating states more efficient and less burdensome, especially for businesses with a high volume of sales into multiple SST states. The Wayfair ruling even listed South Dakota’s membership in SST as one of three reasons* South Dakota’s economic nexus law isn’t an undue burden on remote sellers.

Additionally, businesses that qualify as volunteer sellers, explained in more detail below, can avoid the standard fees for registration, transactions, and filing in SST member states when they use a Certified Service Provider (CSP) like Avalara.

SST state requirements

SST member states are required to have:

  • A central, electronic registration system
  • Consumer privacy protection
  • Simplified administration of exemptions
  • Simplified state and local tax rates
  • Simplified tax remittances and returns
  • State administration of sales and use tax collections (no self-collecting local jurisdictions)
  • Uniform state and local tax bases
  • Uniform sourcing rules for all taxable transactions
  • Uniform tax base definitions and rules

SST company requirements

To qualify as a volunteer seller in a member state and obtain CSP services at no cost, your business must meet all the following criteria during the 12-month period immediately preceding the date of registration with the member state:

  • No fixed place of business for more than 30 days in the state
  • Less than $50,000 of property in the member state
  • Less than $50,000 of payroll in the state
  • Less than 25 percent of total property or payroll in the state
  • Additional criteria

Having economic nexus in a state doesn’t automatically disqualify you from obtaining volunteer status. If you do not meet all of the criteria above, you will have a non-volunteer status and may need to pay for portions of CSP services.
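As an illustration only, the volunteer-seller test can be sketched in a few lines of Python. The thresholds come from the list above; the function and field names are invented for this example, the "additional criteria" are omitted, and actual qualification is determined under the full SST rules, not this sketch:

```python
def qualifies_as_volunteer(in_state, total_property, total_payroll):
    """Simplified volunteer-seller test for one member state, using figures
    from the 12 months preceding registration. The SST's additional
    criteria are omitted -- illustrative only, not tax advice."""
    return (
        in_state["fixed_place_days"] <= 30                  # no fixed place of business > 30 days
        and in_state["property"] < 50_000                   # < $50,000 of property in the state
        and in_state["payroll"] < 50_000                    # < $50,000 of payroll in the state
        and in_state["property"] / total_property < 0.25    # < 25% of total property
        and in_state["payroll"] / total_payroll < 0.25      # < 25% of total payroll
    )

# A remote seller with a small footprint in the member state:
seller = {"fixed_place_days": 0, "property": 10_000, "payroll": 0}
print(qualifies_as_volunteer(seller, total_property=2_000_000,
                             total_payroll=5_000_000))  # True
```

Note that economic nexus plays no part in this check; as stated above, having economic nexus in a state doesn’t automatically disqualify you from volunteer status.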

Any business may register through the SST website to receive the standard benefits listed above — simplified registration and more uniform rules and regulations — in any or all 24 member states. Qualifying businesses may also receive amnesty in select states (subject to limitations).

Additional benefits for volunteer sellers include:

  • No SST registration fees in participating states
  • No calculation fees in participating states
  • No monthly filing fees in participating states
  • Audit protection in participating states

Outsource sales tax management to a CSP

Automating sales tax compliance simplifies tax management for any business required to collect sales tax in multiple states. In SST states, there are added benefits to working with a CSP, for both volunteer and non-volunteer sellers.

As a CSP — one of the first certified by SST — Avalara must meet rigorous standards for data processing and management of sales tax information. We can help you with all aspects of sales tax compliance, from determining where you have sales tax nexus and thus an obligation to collect sales tax, to audit response.

Learn more about working with Avalara in SST states.

*South Dakota law also “affords small merchants a reasonable degree of protection” by providing an exception for sellers with less than $100,000 in sales or fewer than 200 transactions in the state in the current or previous calendar year, and it prohibits retroactive application of its economic nexus law.
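To make those thresholds concrete, here is a short Python sketch of a South Dakota-style economic nexus test. The function name is made up, and thresholds, measurement periods, and rules vary by state, so treat this as illustrative rather than tax advice:

```python
def has_economic_nexus(sales, transactions,
                       sales_threshold=100_000, txn_threshold=200):
    """South Dakota-style test: nexus is triggered if EITHER the sales
    threshold or the transaction-count threshold is met in the current
    or previous calendar year (simplified; rules vary by state)."""
    return sales >= sales_threshold or transactions >= txn_threshold

# High transaction count triggers nexus even below the dollar threshold:
print(has_economic_nexus(sales=85_000, transactions=250))  # True
# Below both thresholds, this seller falls under the small-merchant exception:
print(has_economic_nexus(sales=85_000, transactions=150))  # False
```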

 

Considerations for Building Cloud-Native Applications https://blogs.perficient.com/2019/06/24/considerations-building-native-cloud-applications/ https://blogs.perficient.com/2019/06/24/considerations-building-native-cloud-applications/#respond Mon, 24 Jun 2019 13:16:40 +0000 https://blogs.perficient.com/?p=241052 This is the next installment in a series of blogs on the subject of cloud transformation and building cloud-native applications.

In Gartner’s report “Top Emerging Trends for Cloud-Native Infrastructure,” published in May 2019, the authors say “…leaders are keen to invest in cloud-native infrastructure technologies to increase software velocity; enable developer agility, application scalability and resilience; and reduce technical debt.” The report also states that while “leaders are keen to exploit cloud-native infrastructure technologies, production deployments are still constrained by a skills gap and lack of technical know-how.” Most companies see clear benefits to moving to cloud-native technology and applications, but many lack the skills needed to get there.

The capabilities for building greenfield cloud applications have changed dramatically over the past decade. AWS launched in 2006 with three services and has over 165 services today. Netflix began its move from a monolithic architecture to an AWS cloud-based microservices architecture in 2009. Docker was released to open source in 2013, followed by Kubernetes in 2015. And AWS Lambda introduced serverless computing to the public cloud in 2014. The pace of innovation in the cloud space is staggering. While many companies have not yet built out capabilities for containers, we are facing a new wave of innovation and application implications for the serverless approach.

Given that most companies want to adopt cloud-native and have a cloud skills gap, it is beneficial to manage a cloud-native transition as tracks within a program. For the cloud-native transition program, maturity targets, training programs, and technology acquisitions can be managed within tracks.

Past blogs in this series have addressed the need for cloud migration planning, reference architecture and guidelines, and organizational structure and governance.

At a high level, the following steps address the needs of a transition to cloud-native development:

  • Establish your container and serverless platform strategy
  • Create a cloud skills development strategy
  • Select and adopt the right tools for your environment and culture
  • Create lightweight architecture guidelines and best practices
  • Start small, measure and gauge effectiveness and adjust for emergent patterns
  • Embrace and evolve DevOps capabilities including CI/CD automation and application monitoring
  • Build automation, security, and compliance into your application and CI/CD lifecycle
  • Architect applications as collections of microservices running in containers or serverless
  • Selectively migrate monolithic applications to the cloud

Developers’ guidelines for building and deploying cloud-native applications should be lightweight, with a focus on developer experience and agility.

The following guidelines would be helpful for developers making a transition to cloud-native development:

  • 12-factor app guidelines – customized to the cloud environment
  • Language and framework standards – e.g. Spring Boot
  • Platform specific best practices – e.g. AWS, Azure, GCP, PCF, Kubernetes, Docker, and OpenShift
  • Security implementation guidelines – network, scan automation, patching
  • DevOps – provision, deploy, CI/CD guidelines
  • Runtime standards – service mesh, application performance monitoring, logging
  • Migration guidance – when, why and how to migrate legacy monoliths
  • Microservices reference architecture and design guidelines
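As a small example of what the first bullet implies in practice, one of the 12-factor principles is to store configuration in the environment rather than in code, so the same artifact can run unchanged in any cloud environment. A minimal Python sketch, with variable names (DATABASE_URL, PORT, LOG_LEVEL) invented for the illustration:

```python
import os

# 12-factor principle III: read config from the environment, not from code,
# so the same build can be promoted across dev/test/prod unchanged.
class Config:
    def __init__(self, env=None):
        env = os.environ if env is None else env
        self.db_url = env.get("DATABASE_URL", "postgres://localhost/dev")
        self.port = int(env.get("PORT", "8080"))
        self.log_level = env.get("LOG_LEVEL", "INFO")

# In production this would read os.environ; a dict is passed here to show
# that the deploy environment, not the code, decides the values:
cfg = Config({"PORT": "9000", "LOG_LEVEL": "WARN"})
print(cfg.port, cfg.log_level, cfg.db_url)  # 9000 WARN postgres://localhost/dev
```

The same idea applies regardless of language or platform: the defaults are developer conveniences, and every environment-specific value is injected at deploy time.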

Perficient has teams of highly experienced cloud strategists, architects, DevOps engineers, and change management experts should you need any help with your cloud-native transition. We have invested in developing cloud reference architecture and training experienced individuals on the latest development approaches including cloud-native development, PaaS, DevOps, microservices and re-platforming.

Microsoft Teams Direct Routing Health Dashboard https://blogs.perficient.com/2019/06/22/microsoft-teams-direct-routing-health-dashboard/ https://blogs.perficient.com/2019/06/22/microsoft-teams-direct-routing-health-dashboard/#respond Sat, 22 Jun 2019 19:05:32 +0000 https://blogs.perficient.com/?p=241295 Not long ago I posted a blog series on Direct Routing, which I highly encourage you to check out if your organization will be going this route. To add on to this, Microsoft has recently released a Health Dashboard for Microsoft Teams Direct Routing which will allow you to monitor your connection between your SBC and the Direct Routing interface. In this article we’ll break down the capabilities of the Health Dashboard and show how useful this new feature can be for monitoring and troubleshooting your Direct Routing environment.

Health Dashboard for Direct Routing

With this latest update to the Microsoft Teams admin portal you can now monitor information about your:

  • SBC
  • Telephony Service
  • Network parameters between the SBC and the Direct Routing interface

With this new holistic view of Direct Routing monitoring you can quickly identify issues such as dropped calls and find the root cause (e.g. expired certificates or network issues). Microsoft has approached this monitoring in two different ways:

  1. Looking at the overall health of the connected SBCs
  2. Digging deeper into more detailed information about the connected SBCs

Overall Health of connected SBCs

To give you an example of what you can expect to see in the Teams admin center, notice in the screenshot below that the monitoring checks specific aspects like:

  • SBC
    • SBC FQDN
  • Network effectiveness
    • The measured ability of the network to deliver a call, relative to calls sent vs. calls delivered
      • Excludes call rejections by the far-end user (these count as successful deliveries)
      • Formula: NER (Network Effectiveness Ratio) = (Answered calls + User busy + Ring no answer + Terminal reject seizures) / Total seizures × 100
  • Average call duration
    • Call duration can often help monitor the quality of the calls
      • Shorter duration typically indicates quality issues
      • Calls shorter than 15 seconds typically mean something in the call went wrong and the user was forced to terminate the call
  • TLS connectivity status
    • Shows the status of the TLS connection between the SBC and the Direct Routing Interface.
    • Checks for certificate expiration on the SBC
      • Gives warning when certificate is 30 days from expiring
  • SIP option status
    • Shows if there is an issue with SIP option flow. Also provides a detailed description of the errors
    • Values for SIP options status messages are as follows:
      • Active – Means SBC is active and the Direct Routing interface sees the options being delivered on a regular interval
      • Warning, no SIP options – This means that the SBC is discoverable in the environment and is configured to send SIP options, but the Direct Routing service is not receiving SIP options coming back from the SBC
      • Warning – SIP Messages aren’t configured – This means that the trunk monitoring for the SIP options is not enabled.
        • Microsoft Calling System uses SIP options + TLS handshake monitoring to detect the health of the SBC at an application level
        • If the trunk can be reached when pinging it, but the SBC certificate has expired or the SIP stack isn’t working properly, Microsoft recommends enabling the sending of SIP options.
  • Concurrent call capacity
    • Calculates how many calls were sent/received by Direct Routing
      • Use the New-CsOnlinePSTNGateway/Set-CsOnlinePSTNGateway cmdlets along with the -MaxConcurrentSessions parameter

Shows Health Dashboard statistics

Image provided by: https://docs.microsoft.com/en-us/microsoftteams/direct-routing-health-dashboard
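To see how the Network Effectiveness Ratio behaves, here is a small worked example in Python. It applies the standard NER definition (answered + user busy + ring no answer + terminal reject seizures, divided by total call attempts, times 100) and is a sketch for intuition, not Microsoft's implementation:

```python
def network_effectiveness_ratio(answered, user_busy, ring_no_answer,
                                terminal_rejects, total_seizures):
    """NER counts far-end rejections and busy/no-answer outcomes as
    successful network deliveries; only failures inside the network
    (or at the SBC) lower the ratio."""
    delivered = answered + user_busy + ring_no_answer + terminal_rejects
    return delivered * 100 / total_seizures

# 1,000 call attempts (seizures): 950 answered, 20 busy, 20 unanswered,
# 5 rejected by the far-end terminal; the remaining 5 failed in the network.
print(network_effectiveness_ratio(950, 20, 20, 5, 1000))  # 99.5
```

A dip in this number, combined with the TLS and SIP options statuses above, is what points you toward a network or SBC problem rather than ordinary user behavior.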

But wait, that’s not all! You can dig even deeper by looking at the detailed view, which provides information on a specific SBC (as seen in the image below).

Health dashboard SBC details

Image provided by: https://docs.microsoft.com/en-us/microsoftteams/direct-routing-health-dashboard

This detailed SBC view shows the following:

  • TLS Connectivity status – Same metric as “Overall Health” page
  • TLS Connectivity last status – Shows the last time when the SBC made a TLS connection to the Direct Routing service
  • SIP options status – same metric as “Overall Health” page
  • SIP options last checked – Shows the last time when SIP options were received
  • SBC status – Current status of that particular SBC
  • Concurrent call – Shows all concurrent calls being handled on that particular SBC
    • You can filter on a 7 day, 30 day, or 60 day basis
    • Provides metrics for inbound, outbound, and all streams
  • Network parameters – Shows all network parameters measured from the Direct Routing interface to the SBC.
    • The following metrics are measured:
      • Jitter – Measures the variation in network propagation delay time between two endpoints that use RTCP, in milliseconds (ms)
      • Packet Loss – Measures packet arrival failure between 2 endpoints
      • Latency/RTT – The length of time it takes for a signal to be sent plus the time it takes for an acknowledgement to be received; consists of the propagation time between the two endpoints
  • Network Effectiveness ratio (NER) – Same parameter that appears on the “Overall Health” page
    • Gives you the additional ability to filter data by time series or call direction

For a full breakdown of this new addition to the Teams admin center, you can check out the official documentation here.
