Perficient has many great partners that support our development and deployment of the best-of-breed solutions we provide for our clients. This post is one in a series that will highlight some of the products available from our partners. Today, I’ll be presenting K2 and their Workflow and Forms Apps for SharePoint in the Cloud.
K2 Appit for SharePoint is a cloud-based platform that allows you to easily deliver workflow and forms apps for SharePoint 2013 and SharePoint Online, without code. Use it to help your people get more work done, at any time and from anywhere, with real-time information that enables smarter, faster decisions.
K2 APPIT FOR SHAREPOINT DELIVERS:
A SMARTER WAY TO WORK WITH SHAREPOINT
With Appit, you can deliver SharePoint-based workflows and forms that link on-premises and cloud-based systems, to give your users the information they need. No code required.
Build workflow apps for SharePoint documents and lists.
Build forms and workflows that combine SharePoint and line-of-business data.
To learn more about how K2 Appit for SharePoint can help you deliver workflow apps in all the places your people work, contact us. We’ll arrange a free demo.
Just as we gear up for a webinar focusing on the digital experience – scheduled for this Wednesday, more information here – our team has published a new Perficient Perspectives piece on a closely related topic. Perficient Perspectives is a series of informal Q&As with subject matter experts on hot technology topics and trends.
In the latest Perspective, Enhancing the Digital Customer Experience with Sitecore, experts from Perficient’s XD, Sitecore, and Portal, Web Content & Social practices teamed up to share their thoughts around why digital experiences and presentation are crucial factors in shaping customers’ perceptions of a company. Jason Maloney, Mark Gehman, and Michael Porter explain how Sitecore’s web content management system enables continuous improvement of a website to address heightened customer demands, and how Sitecore’s digital marketing creates an opportunity for marketers to deliver on their strategic goals, with tools that include engagement automation and analytics.
One quote that really stood out to me from Jason Maloney, on the need to create a digital experience through your website that fits your unique set of customers:
I’ve always thought of the digital experience, at least the brand’s .com, as the centerpiece of its customer experience. If you are in retail, it’s your biggest store. In automotive, it’s your largest dealership. You aren’t limited to stock on hand, location limitations or customer service representative bias. And you can test, learn and adapt in practically “real time” to meet your customer’s needs. That said, I think it’s critical to communicate the brand’s truths to your customers in all experiences. A website should not be the same for all brands because one set of customers is different from the other.
As mentioned, in conjunction with this new Perspective, we’ve got a great webinar lined up for Wednesday, August 27, at 1 p.m. CT, Using the Right Content Strategy to Create a Personalized Digital Experience. Jason, Mark and Michael will be sharing some key content strategies, best practices and ways to create content that keeps your users coming back. They’ll also discuss how tools like Sitecore can help drive the personalized digital experience.
We were recently working with a client for whom we installed the Sitecore Active Directory Module version 1.1 on a Sitecore 7.1 implementation. After configuring the AD module in the client’s authoring environment, two issues surfaced. The first was a .NET error: [ArgumentException: Provider name cannot be null or empty.], which appeared when we tried to include additional fields from AD in Sitecore, such as telephone number. Once we reverted our profile configuration, we also realized that roles from AD were not being integrated.
So after a lot of trial and error, and a couple of rounds through Sitecore support, the culprit ended up being a space character. Our specific space character was in the organizational unit similar to OU=Corporate Users. So Sitecore support developed a quick patch to work around the issue. We installed the new assembly and made some configuration changes:
<add name="ad" type="LightLDAP.Support.SitecoreADProfileProviderFixed, Sitecore.Support.403508" connectionStringName="ad" … /> – change the profile provider definition
<add name="ad" type="LightLDAP.Support.SitecoreADRoleProviderFixed, Sitecore.Support.403508" connectionStringName="ad" … /> – change the role provider definition
If you run into such an issue in your implementation, don’t hesitate to contact Sitecore support to get the patch. Make reference to issue ID 417172.
Having just gone through the onboarding process myself, I believe that using Yammer to facilitate onboarding can help drive adoption among recent hires. In addition, using Yammer for onboarding will help them get a functional grasp of the organization much more quickly. Recent hires will have the whole Yammer network to ask questions of, as well as the ability to see all the posts from before they were hired. In an environment where new hires are increasingly under pressure to add value quickly, these five tips could help make that a reality.
Different people have different ideas on whether groups of new hires, who were hired at different points, should have separate groups or one large onboarding group. In my mind, it’s better to have one large onboarding group that people can join when they are hired and leave when the information in the group no longer pertains to them. That way, the information and knowledge generated by previous groups of new hires is still there for the most recent ones. Everyone who joins a company is going to have the same questions coming in, and rather than answer the same questions over and over, it’s easier to answer each question once and have new employees refer to that post.
New employees might never have had any previous exposure to Yammer, so it’s beneficial to encourage them to join your network’s “Yammer 101” group. This way they can learn how to use Yammer while learning about your company at the same time! In addition to the “Yammer 101” group, also encourage new hires to join other groups that interest them. Even if a group is not particularly business related (one colleague told me about a “Selfie” group), it can help foster a sense of team spirit, and some research suggests it can help with employee retention.
When I was in school, I remember studying learning styles – a “series of theories suggesting systematic differences in individuals’ natural or habitual pattern of acquiring and processing information in learning situations.” I was always the Converger: very hands-on, figuring out things for myself, testing theories. For me, this started at an early age. I can remember being one of the first students in middle school to harness the power of the internet, around 1992 – 1994. I remember discovering LexisNexis, AltaVista, and later Yahoo to read academic papers and abstracts. Writings, facts, and opinions that just weren’t available in my school library were now available on the computer. I learned how to put information at my fingertips by using search engines. Even in the early days, this was far more informative than an old encyclopedia, and way more fun!
Fast forward to my college years, and my search engine skills continued to progress. As I learned C++, VBScript, and Java, I relied heavily on the internet for the most up-to-date information on techniques, theory, and examples. Books simply couldn’t keep up with the power of the internet and its ever-growing database of information. It was a great way for me to learn and get through college, and it continues to be a very sharp tool in my tool belt today.
In this post, I’m going to show you a few search engine tricks so you can Bing your way to success!
1. Use Quotes to Find Exact Results
There’s a lot of interest around moving to the cloud, and specifically, SharePoint Online. Because of that, we’ve had several webinars over the summer that focus on SharePoint Online and SharePoint in a hybrid environment (you can view all our past Microsoft webinars here, beginning with the most recent).
Despite that interest, migrations can be a bit of a headache (or in some cases, a debilitating migraine). But, if you do your research and plan properly, the process can be a fairly smooth one – possibly even your last, since once in the cloud, you shouldn’t need to do intensive upgrades or migrations in the future.
Last week, we held another session around SharePoint Online, this time focusing on Best Practices for a Successful SharePoint Migration or Upgrade to the Cloud. My colleague, Jason Bell, a senior solution architect within our Microsoft practice, kicked off the webinar with the top reasons to move to SharePoint Online. Following this, he shared migration methodology, which includes your migration assessment, migration development, and the actual migration plan.
Next, Jason talked about the different migration approaches – manual, scripted, or the use of a third party tool like AvePoint, Metalogix, or Sharegate. He wrapped up with a discussion around secure cloud computing, including information rights management and the use of Office Web Apps.
With the rapidly evolving migration to the cloud, SharePoint teams are faced with a new challenge: how do we develop and deploy for SharePoint Online?
If your feet have been firmly planted in on-premises development for SharePoint, it can be a little daunting trying to move your process to the cloud. Where and how should we conduct development? How can we manage releases through development, quality assurance, and production?
This article aims to help you get started and is based upon the hands-on experience of working with SharePoint 2013 Online during the past 18 months.
Develop for the Service
Above all other recommendations, it is highly advisable to build new features for the service using SharePoint Online. Whether you are writing CSOM, customizing a master page, or building an App, you should do this for the service and not in a local (on-premises) development environment. SharePoint Online offers a rich, extensible API, but it can be extremely sobering to realize that the feature you just spent the last few weeks building relies upon a capability not available Online. If you are developing features for both Online and on-premises, you can always bring things back on-premises later.
With an MSDN subscription, developers can provision their own Office 365 tenant and begin development within a few minutes. Compare that to the hours it would take a developer to build their own VM for on-premises development. If the developer does not have an MSDN subscription, they can always use a trial tenant on a temporary basis or pay for a single-user tenant for indefinite use. When provisioning any new tenant for development, ensure that it is under the same license as QA and production (e.g. E3).
Once a developer is ready to deploy and review new features they can do this on a separate Development (Integration) tenant accessible to the team. This Development Environment is typically used for demonstrations of new features, in SCRUM Review meetings for example.
Consistent with any mature software development practice, it is important to ensure that Development, QA, and Production are properly isolated, with permissions configured accordingly. Developers will most likely have full administrative access to Development but only read access (or less) to QA and Production. Keeping your developers out of Production is a key principle for stability and ensures good, consistent deployment techniques are employed. It also maintains the healthy tension between developers and administrators, a rivalry as old as time that keeps the project fun!
It helps to name tenants consistently. We usually use the convention:
A key consideration with this isolation is how to maintain accounts across all three environments. Most likely, the Production environment will have federated identities synchronized to the cloud with ADFS and DirSync or FIM. This allows us to work with corporate credentials in Production. However, a single domain can only be synchronized to one Office 365 tenant, so what should be configured for Development and QA? It is of course possible to build new domains (on-premises) and mirror the synchronization used for Production. This is the purest way of ensuring Development and QA are true representations of Production. However, it may be overkill for your development and testing needs.
It can be advantageous to use cloud accounts (onmicrosoft.com) in Development and QA; they are extremely lightweight and easy to manage as your team grows. Cloud accounts are particularly useful when working with professional services organizations, as they can avoid what might otherwise be a lengthy account setup process. However, if your solution relies heavily on synchronized identities, then it may be necessary to have Development and QA domains that mirror Production.
Another key driver for isolating tenants in this way is that it ensures no global configuration changes during development can impact the production system. Consider the configuration of:
One could argue that developing in a single site collection isolates development appropriately. However, the misconfiguration of these items alone could easily break a production system and take some time to recover from; for example, Search may need to re-crawl, or the Content Type Hub will need to wait for a scheduled push.
This article will not fully elaborate upon Scripted Deployment to SharePoint Online; I will write another article on this topic shortly. However, it is an important principle of this model. Automating any repeated task is a productivity benefit, provided the time invested in developing the automation is less than the time spent repeating the task. Automation also significantly reduces the chance of human error. It is less obvious how to automate deployments for SharePoint Online, but the benefits are clear and have paid huge dividends for our teams working with the service.
What is Scripted Deployment? For SharePoint Online this means writing PowerShell with XML configuration and using the SharePoint Server 2013 Client Components SDK (currently v16). The PowerShell is run locally on the developer or administrator’s machine but connects to SharePoint Online using the Client Object Model. Through this script we can deploy most things required for SharePoint Online customization such as:
It has taken some investment to develop these PowerShell modules, but they become highly reusable across projects.
As developers work with their own tenant they develop the deployment scripts required for their feature. Those familiar with SCRUM will relate to ‘Done Criteria’. Our Done Criteria includes development of a feature and its scripted deployment to the Development (Integration) tenant where it can be reviewed. There are some exceptions which cannot be achieved by this technique but the Client Object Model does support a very wide range of common needs for deployment and configuration. Where exceptions exist these are documented in a deployment document for manual execution by an administrator.
Replication of Production Data
It is desirable to have recent data available in QA to ensure good, valid testing. For this replication, it is advisable to use a third-party migration tool like Metalogix Content Matrix. When selecting a tool for this purpose, ensure that it can migrate the data faithfully, but also that it can transform data as required. For example, if Production data uses synchronized identities but QA uses cloud accounts, it will be necessary to perform some transformation:
email@example.com could be mapped to firstname.lastname@example.org
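As a rough sketch of this kind of identity transformation (the function, the override table, and the domain names below are purely illustrative, not part of Content Matrix or any particular tool), the mapping might look like:

```python
# Illustrative sketch only: the override table and domain names below are
# hypothetical and not part of any specific migration tool.

# Explicit overrides for accounts whose local part differs between environments.
OVERRIDES = {
    "email@example.com": "firstname.lastname@example.org",
}

QA_DOMAIN = "example.org"  # hypothetical QA cloud-account domain


def map_identity(production_upn: str) -> str:
    """Resolve a Production identity to its QA cloud account: use an explicit
    override when one exists; otherwise keep the local part and swap domains."""
    if production_upn in OVERRIDES:
        return OVERRIDES[production_upn]
    local, _, _domain = production_upn.partition("@")
    return f"{local}@{QA_DOMAIN}"
```

A migration tool applies the same idea at scale, typically driven by a mapping file it consumes during the copy.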
Happy development and deployment!
Over the months we have released a lot of information on building analytic platforms in healthcare. Several members of my team have played key architectural roles in not only implementing the Cogito platform and performing readmission analysis with it, but also expanding the platform to include customer satisfaction data from Press Ganey.
These functions were deemed critical to the initial phases of these projects, but are largely ‘back-end’ architectural projects. They do not address the ad-hoc analysis needs of the business, the delivery technologies available or much less the predictive capabilities that can be added to the platforms.
Fortunately there are a lot of new technologies in the Microsoft stack to address these needs.
As part of our advisory services to help our clients understand what new capabilities they have with their new platforms, we regularly build concept visualizations. The following videos are examples of out-of-the-box capabilities we built for one of our clients utilizing:
Self-service analytics with Power Pivot and Power View
3D visualizations with Power Map
And finally natural language query processing in the cloud with Q&A in Power BI
These technologies are well known and are being leveraged within several of our large clients, but a couple of recent announcements from Microsoft introduce even more exciting capabilities.
Power View now supports forecasting. This is a great new addition, currently available in the HTML5 version of Power View in Power BI. It gives the user the ability to quickly forecast a trend line, account for seasonality, and even adjust the confidence intervals of the calculation. Below is a screenshot of some readmission forecasting being performed on the dataset from the earlier videos.
It is important to note that you not only see the forecasted line (the light blue line that runs through the gray box in the top chart), but the second chart also shows the hindcasting feature, which lets a user start a forecast in the past in order to see how accurate it would have been against real data (the light blue line to the left of the gray box in the second chart).
While valuable and easy to use, this technology doesn’t give us the ability to predict who is at risk of readmitting. For that, we need a more powerful tool.
Azure Machine Learning Services is a recently announced cloud service for the budding data scientist. Through a drag-and-drop interface, you can now build experiments of predictive models, train and score the models, and even evaluate the accuracy of different algorithms within your model.
The screenshot below shows an experiment that was built against the same readmission data used in the forecasting example (Epic Cogito dataset). The dataset was modified to flatten multiple patient admissions onto one record and included the following attributes as well as some others:
The experiment was then created to compare two different classification algorithms, a boosted decision tree vs. a logistic regression. *Note that this blog is not intended to debate the accuracy or appropriate use of these particular algorithms. These were just the two I used.
Once the experiment is complete and evaluated, a simple visual inspection shows the accuracy gains one algorithm has over the other.
After some tweaking (and this model still needs it), there is a simple process to create a web service with an associated API key, which you can use to integrate the model into a readmission prediction application, one that accepts single-record or batch inputs.
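As an illustration of that last step, a minimal scoring client might look something like the sketch below. The endpoint URL, API key, and attribute names are placeholders, and the request shape reflects the general column-names-plus-values format used by the Azure ML request/response service; check your own web service’s API help page for the exact schema.

```python
import json
import urllib.request

# Placeholders -- substitute the values from your experiment's web service dashboard.
ENDPOINT = "https://example.services.azureml.net/workspaces/WS/services/SVC/execute?api-version=2.0"
API_KEY = "<your-api-key>"


def build_scoring_request(record):
    """Package one flattened patient record as column names plus a row of values."""
    columns = list(record.keys())
    return {
        "Inputs": {
            "input1": {
                "ColumnNames": columns,
                "Values": [[record[c] for c in columns]],
            }
        },
        "GlobalParameters": {},
    }


def score(record):
    """POST a single record to the published web service and return the parsed reply."""
    body = json.dumps(build_scoring_request(record)).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + API_KEY,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))


# Illustrative input; real attribute names come from the flattened Cogito dataset.
sample = {"Age": 72, "LengthOfStay": 5, "PriorAdmissions": 2}
```

Batch scoring works the same way, with multiple rows in the Values array.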
As you can see, there are a number of options for introducing advanced analytics into your healthcare environment. Feel free to contact me with questions on how these tools can be put to work in your new healthcare analytics platform.
As a marketer, the focus on engagement and shift to a more connected, digital experience is incredibly interesting to me. Not long ago, the online experience was fairly simple – you had a website, and you pointed your customers to that site. Your static content was adequate at the time.
Fast forward to 2014… what worked back when we partied like it’s 1999 (oh wait, it was) isn’t going to cut it today. According to Internet World Stats, as 2013 came to an end, there were over 2.8 billion people in the world online. In 2000, there were fewer than 400 million online. For the sake of comparison, the world population is around 7 billion today.
Needless to say, that’s a lot of eyes on your .com site. And on that site, we’ve got articles, blog posts, comments, infographics, audio, video, and images, to name a few. It’s easy to be so hyper-focused on content creation that we lose sight of whether that content is even relevant to our target demographic.
Enter a content strategy. This will help you figure out which content type to use where, allowing you to both personalize and enrich your users’ digital experience. Join us on Wednesday, August 27, 2014 at 1 p.m. CT for a webinar, Using the Right Content Strategy to Create a Personalized Digital Experience to learn about some key content strategies, best practices and ways to create great content to keep your users coming back. We’ll also discuss how tools like Sitecore can help drive the personalized digital experience.
During the webinar, you’ll hear from Jason Maloney, Director of Perficient XD, Michael Porter, Principal of Portal, Web Content and Social Solutions at Perficient, and Mark Gehman, Perficient’s Sitecore Practice Director. Together, they’ll share a lot of actionable tips to get you started on creating or improving your content strategy.
To register for the webinar, click here.
Using the Right Content Strategy to Create a Personalized Digital Experience
Wednesday, August 27, 2014
1:00 p.m. CDT
Last week, Perficient’s Zach Handing wrote a post over on our Spark blog explaining what to make of the recent Internet Explorer announcement published on Microsoft’s Internet Explorer blog. In the article, Microsoft discussed their plans for supporting older versions of IE. There was quite a bit of racket across the web as people interpreted the information in different ways; facts quickly turned into exaggerations, or straight fiction. As Zach wrote:
I have seen many eager Interneters making loud claims to the tune of, “IE8 is dead! We no longer have to support older versions of IE!” However, it’s very easy to get caught up in the pandemonium or start bandwagon-ing and miss the actual facts of what is and will be happening according to Microsoft. I want to clarify some things and set the record straight before we all hang up our Windows XP virtual machines.
What did Microsoft write to cause this, you ask? From the article:
After January 12, 2016, only the most recent version of Internet Explorer available for a supported operating system will receive technical support and security updates.
Zach goes on to explain that there are two important things we can learn from this quote that are worth noting, one of which is the following:
The first is that Microsoft is only stating that they plan to stop providing technical support and security updates for all versions of IE except the most current available for each of their operating systems. The table below shows exactly which versions they mean.
Windows Platform – Internet Explorer Version
Windows Vista SP2 – Internet Explorer 9
Windows Server 2008 SP2 – Internet Explorer 9
Windows 7 SP1 – Internet Explorer 11
Windows Server 2008 R2 SP1 – Internet Explorer 11
Windows 8.1 – Internet Explorer 11
Windows Server 2012 – Internet Explorer 10
Windows Server 2012 R2 – Internet Explorer 11
So where is Internet Explorer 8 in that table? What does the fact that it is missing mean?
…that doesn’t mean IE8 is going away. All this means is that Microsoft is not going to provide updates or support for IE8 anymore; it does not mean that people are going to magically stop using it. The article also mentions that “Microsoft recommends enabling automatic updates to ensure an up-to-date computing experience”, but recommending that it happens does not mean that everyone will do it. Yes, this is a big leap towards a day when developers do not need to worry about IE8 specific styles, but that day is not here yet.
So what’s the second big part? Zach tells us to take a look at that date… January 12, 2016. That’s pretty far in the future… approximately a year and a half. So for the next eighteen months, Internet Explorer 8 will still be alive and kicking, as Microsoft will still be supporting and providing updates for that version. And even after that, Internet Explorer 8 will still be around.
You can read Zach’s full post here on our Spark blog. The Spark blog is Perficient’s perspective on all things innovative, and the crew that blogs over there has been posting some really interesting stuff around UX, UI and design. Check them out!