PowerShell Deployment to SharePoint Online

In my last blog post about DevOps for SharePoint Online, the process I presented relied heavily upon scripted deployment to SharePoint Online (O365). I wanted to expand upon that and explain in more detail how Perficient is using PowerShell to manage our deployments across our Development, QA, and Production environments.

Automating any repeated task can be a productivity benefit, provided the time invested in developing the automation is less than the time spent repeating the task itself. Automation also significantly reduces the chance of human error.

Automating deployments is of little benefit to light users of SharePoint who do minimal customization in a single O365 tenant. However, as you begin to customize more and introduce the need for testing cycles, automation starts to become valuable. When you add multiple tenants and multiple developers or administrators to your DevOps process, automated deployment can pay huge dividends.

I think it is fair to say we are in a period of emerging standards for deploying customizations to SharePoint Online. When we worked on-premises with SharePoint, the WSP provided great deployment options, especially when you consider Feature stapling. This is basically off the table with O365, and we're looking for a new best practice.

I think the combination of PowerShell and the SharePoint Server 2013 Client Components SDK is a strong candidate for best-practice automation of deployment to SharePoint Online. PowerShell gives us the lightweight scripting we need to move rapidly through automated builds and deployments. The Client Components SDK gives us the full Client Object Model on the administrator's desktop, allowing them to execute a huge variety of scripted tasks. Here are a couple of useful resources on this topic, one from my colleague Roydon Gyles-Bedford, whom I credit with much of Perficient's thought leadership in this area:

https://github.com/rgylesbedford
http://soerennielsen.wordpress.com/2013/08/25/use-csom-from-powershell
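To make the pattern concrete, here is a minimal, hedged sketch of how a deployment script might load the Client Components SDK assemblies and open a CSOM connection to SharePoint Online. The assembly paths, tenant URL, and account name are assumptions to adapt for your environment:

```powershell
# Load the CSOM assemblies from the Client Components SDK
# (default install path shown; adjust for your machine).
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\15\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"

$siteUrl  = "https://contoso.sharepoint.com/sites/dev"     # hypothetical tenant
$userName = "admin@contoso.onmicrosoft.com"                # hypothetical account
$password = Read-Host -Prompt "Password" -AsSecureString

$context = New-Object Microsoft.SharePoint.Client.ClientContext($siteUrl)
$context.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($userName, $password)

# Nothing goes over the wire until ExecuteQuery() is called
$web = $context.Web
$context.Load($web)
$context.ExecuteQuery()
Write-Host "Connected to web:" $web.Title
```

Once a `ClientContext` like this is in hand, every other deployment task in the script can batch its operations and commit them with `ExecuteQuery()`.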

At Perficient we have invested in PowerShell Modules which use XML configuration to drive deployment of items such as:

  • Master Pages
  • Page Layouts
  • Content Types
  • Display Templates
  • Term Store Terms

The XML configuration files are pseudo-CAML (Collaborative Application Markup Language) wrapped in our own markup to help the Modules know what to do with it. The nice thing about CAML is that it is already defined and baked into SharePoint. We will often use the SharePoint Client Browser tool (http://spcb.codeplex.com) to browse existing artifacts like Content Types to understand how to define them from scratch.

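As an illustration, here is a hedged sketch of what one of these wrapped definitions might look like. The outer elements are our own illustrative markup, not a published schema; the inner `<ContentType>` element is standard CAML, and the Title field's GUID is SharePoint's built-in value:

```xml
<!-- Illustrative only: the <Deployment>/<ContentTypes> wrapper is hypothetical
     markup of our own; the <ContentType> element itself is standard CAML. -->
<Deployment>
  <ContentTypes>
    <ContentType ID="0x0100E2B1A6D8C4F04E9A8B3D5C7F1A2B3C4D"
                 Name="Press Release"
                 Group="Contoso Content Types">
      <FieldRefs>
        <!-- Built-in Title field -->
        <FieldRef ID="{fa564e0f-0c70-4ab9-b863-0177e6ddd247}" Name="Title" Required="TRUE" />
      </FieldRefs>
    </ContentType>
  </ContentTypes>
</Deployment>
```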

Aside from configuration defined in XML, we also drive configuration through PowerShell modules using the Client Object Model directly, for example with a function for adding a Web.

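A hedged sketch of what such a function might look like; `Add-Web` and its parameter names are hypothetical, but `WebCreationInformation` and `Webs.Add` are the standard CSOM way to create a web, and nothing happens server-side until `ExecuteQuery()` runs:

```powershell
# Hedged sketch. Assumes $Context is an already-connected
# Microsoft.SharePoint.Client.ClientContext (see the connection sketch earlier).
function Add-Web {
    param(
        [Microsoft.SharePoint.Client.ClientContext]$Context,
        [string]$Url,                 # leaf URL, e.g. "projects"
        [string]$Title,
        [string]$Template = "STS#0"   # Team Site template
    )

    $webInfo = New-Object Microsoft.SharePoint.Client.WebCreationInformation
    $webInfo.Url = $Url
    $webInfo.Title = $Title
    $webInfo.WebTemplate = $Template
    $webInfo.UseSamePermissionsAsParentSite = $true

    $newWeb = $Context.Web.Webs.Add($webInfo)
    $Context.Load($newWeb)
    $Context.ExecuteQuery()           # the web is created on this round-trip
    return $newWeb
}
```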

At this point in time the Client Object Model does lack functionality compared to its server-side counterpart. However, this is improving all the time, with new methods being added in every release.

In some cases it is possible to inspect the server-side object model using a tool like ILSpy (http://ilspy.net) and find (unsupported) ways to get the job done. For example, we found a way to add links to the Search Center navigation via this technique. I must stress that using an unsupported method should be for convenience only, and you should have a backup plan should it fail. We normally write this backup plan into our deployment documentation; it is usually just a manual way to achieve the same thing, albeit more slowly.

I am now also seeing lots of discussion and examples around HTTP remote operations to help fill the gaps in the Client Object Model. This is of course also unsupported, but it can be effective as a convenience and time-saver. We've used it to map Search crawled properties to the refinable managed properties in SharePoint Online, a task the Client Object Model does not support and one that can take a huge amount of time, so it is ripe for automating. For example, we call a function to update RefinableString00 with crawled properties.

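A hedged sketch of such a call site; `Update-RefinableManagedProperty` and the crawled property names are our own hypothetical examples, and the real implementation drives the search schema admin pages over HTTP rather than a supported API:

```powershell
# Hypothetical call site only: Update-RefinableManagedProperty is a module
# function of ours that posts to the search schema pages over HTTP, since
# CSOM did not expose this mapping at the time. Unsupported convenience.
Update-RefinableManagedProperty -Context $context `
    -ManagedProperty "RefinableString00" `
    -CrawledProperties @("ows_ProjectCode", "ows_ClientName")
```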

In conclusion, automation using scripted deployment can be an extremely versatile and effective way to support your DevOps for SharePoint Online. At Perficient, Scrum has proven to be a very effective methodology for SharePoint Online projects, and we typically make the scripted deployment of any new feature part of the 'Done Criteria' for development work. Scripting the deployment then becomes part of feature development and is effectively tested in development environments before progressing to QA and Production.

Could Yammer Supplant Your Intranet?

We see a lot of scenarios where clients are successfully moving their intranets to the Office 365 cloud with SharePoint Online. This is the easiest, smoothest path to a social intranet on the Microsoft platform, due largely to the ever-closer relationship between Yammer and the rest of the services in Office 365.

That said, there are still plenty of enterprises out there who prefer either to keep their intranet on-premises or not to upgrade or migrate just yet. Many of those organizations would still like to get their bang for the buck with Yammer, however, and need a solution for integrating those social features into their on-premises environment.

By far the most common way to accomplish this right now is through the use of the Yammer Embed functionality (or specifically for SharePoint, the Yammer app for SharePoint) to embed specific news feeds on specific sites.  This is easily the most obvious way to “socialize” an on-premises SharePoint intranet with Yammer.

That works, sure. But it's not all that elegant. Also, if you're using the Yammer app for SharePoint, this approach forces you to go in and update every Yammer feed whenever the app is updated (which is a pain).

A more forward-thinking, less common but emerging approach to a social intranet is to actually use Yammer as the intranet home.

This is an example of truly embracing enterprise social, and it may require a complete rethink from a lot of organizations about how they approach an intranet, but it's the direction things seem to be going. You make the social network your home, and instead of augmenting informational sites with social feeds, you augment social groups with links to informational sites using Pins and the Info window's rich text/HTML editor.

Think about it.  Here at Perficient, we’re in the midst of rolling out a new platform for time tracking, financials, and other fun line-of-business activity and reporting.  We have both a Yammer group stood up to support that rollout, and a more traditional SharePoint intranet site.

What we’ve found in this scenario is that the Yammer feed has actually supplanted the informational site because it’s a much faster and more responsive way for people to get answers and collaborate.  Links embedded in the Yammer page direct users back to SharePoint for the informational / non-collaborative content they need, but the social discussion and interaction is now the focus.

Of course, Yammer in general resists (i.e., doesn't allow) all but the most basic customization. Fonts, styles, navigation, etc. are all locked in as-is; the only thing you can really change in Yammer is the header atop your page. That means we lose some control over branding, but we gain quite a bit in interaction and employee engagement. For this use case, it's a smashing success.

The question then becomes, “Can this approach work for an entire intranet, and not just one use case?”

To some extent, that depends on the users. At the end of the day, it all depends on where they go when they log on in the morning. Email? The intranet? Or their social network? Get the ball rolling with enterprise social and people will start skipping over the intranet; it's almost a given. Use social to surface intranet content and the line starts to blur, which is a lot closer to where things are going in the cloud than a hodgepodge of on-prem intranet sites with embedded social feeds.

ProHealth Care’s BI Program, Data Governance & BICC: Part I

This is Part I in a two-part series on how Perficient helped ProHealth Care operationalize their BI program, data governance, and Business Intelligence Competency Center. Here, I'll focus on the workstreams and the road map. In Part II, I'll cover the members of the data governance steering committee as well as the initiation of data governance and its priorities.

I’d first like to share the approach ProHealth Care and Perficient took to operationalize ProHealth Care’s BI program, initiate some of the data governance activities, and help to operationalize the Business Intelligence Competency Center (BICC).

As you can see below, we applied Perficient's Enterprise Information Management framework to focus our activities in developing the road map for ProHealth's BI program. We were principally concerned with four discrete workstreams to stand up the program, in addition to the core work undertaken to deliver the population health analytics that support the Accountable Care Organization (ACO).

Office 365 Information Recovery

One of the many advantages of using Office 365 is freedom from a myriad of worries at the data center (watch this video for an interesting glimpse at Microsoft data centers), server, and application levels.

One of the most important concerns for any enterprise is "business continuity": making sure a service is both available and, in the event of trouble, restorable with minimum information loss. Two metrics are traditionally used to help define business continuity goals: Recovery Point Objective ("How much data can I afford to lose?") and Recovery Time Objective ("How long can I wait for the service to be available?"). While Microsoft publishes "financially backed SLAs for up-time" (see the Office 365 Trust Center), it does not provide specific RPO and RTO guarantees. RPO and RTO are, however, published for related services (see the SharePoint Online Dedicated and Exchange Online Dedicated service descriptions).

For most organizations, these ranges of RPOs and RTOs are likely to be acceptable. If they are not, the organization will need to design processes to meet its more stringent objectives. Keep in mind that scenarios other than outright Office 365 failure may result in information loss (e.g., accidental document deletion). Some of these scenarios are well supported by the application (e.g., the SharePoint recycle bin, versioning, etc.), but others are not (historical file deletions, file corruption, version overwrites, etc.).

For on-premises deployments, a number of third-party vendors have developed tools to support a wide variety of information loss and recovery scenarios. For Office 365, tools and technologies are beginning to appear; some are extensions of on-premises technology, while others are cloud-only implementations. A number of options are now available.

When looking at these products and services, consider your specific use cases as well as the following:

  • Full platform support – does the product/service support Exchange, OneDrive for Business, Lync, Yammer, AND SharePoint?
  • Integrated tool suite – some of these tools support other Office 365 needs (e.g., governance, data migration); for a larger and/or more complex implementation, a suite will likely prove more valuable than a point solution
  • Archiving – does the tool support removal of data meeting certain requirements, or only recovery of missing/corrupt information?
  • On-premises AND Office 365 – if your organization is transitioning from an on-premises implementation, a solution that seamlessly supports both platforms is ideal
  • Backup location – some of these solutions use cloud storage exclusively, while others offer a variety of targets
  • Target user – some of the solutions are aimed at business users, while most are for IT professionals

As always, a well-formulated set of requirements based upon business needs will make the decision-making process easier. Technology in this area is changing rapidly, so always check for new developments.

Why Does Data Warehousing Take So Long?

A common complaint about data warehousing/BI has been time to market. The investment in real months required to stand up analytics is just too large. Estimates of the actual time required vary (depending on whom you ask, and what their interests are) from a year to 24 months. The numbers are open to debate, but let's stick with the conventional wisdom that data warehousing typically requires a significant timeline to see results. This assumption then raises two questions:

  1. Does it have to take that long?
  2. If it has to, will it be worthwhile?

Looking at the second question first, there’s a very simple answer: YES. Successful DW/BI projects can utterly revolutionize an organization’s processes and even their outlook. They can shine light on problems, point the way to new opportunities, and improve the daily work lives of employees at almost any level. I consider it a foregone conclusion that there is tremendous value in well-built DW/BI systems.

Of course, the caveat there is the whole "well-built" part. That's where the delays creep in, and where the real timeline resides. In addition to the experience and expertise brought to the design and construction of these systems, the degree of involvement of the business also plays a very large role in how successful the solution will be. Too many gaps or failures on either side can result in less-than-satisfactory outcomes after a lot of time and effort.

So that leads us back to the first question above: does it have to take that long? I mean, upwards of 2 years to build a decent Business Intelligence solution? This answer is not nearly as easy because the factors that contribute to extending timelines in DW/BI projects are numerous and varied. For instance:

  • Are the builders of the system planning development closely with the consuming org? If not, extend the timeline.
  • Is the business committed to providing solid requirements on an ongoing basis? If not, extend the timeline.
  • Is the development team sufficiently experienced and under solid technical leadership? If not, extend the timeline.

You get the point. The development work in and of itself is not necessarily what takes a long time. What takes a long time is when business needs are misunderstood or disregarded, when expectations aren’t managed, when the chosen technology platform is not well-aligned to business requirements — basically, when either side doesn’t fully understand what they are getting into, and there is misalignment in that area.

In the next few posts, I'll go over various tools and techniques currently on the market that offer some kind of acceleration of the data warehousing process, and see what paths are available to speed up time-to-analytics. I will include the class of tools variously referred to as "frameworks" or "accelerators". I'll talk about iterative development and the potential risks and benefits of using Agile methodologies. And I'll discuss ways that planning itself can help deliver results sooner rather than later.

Next time: Accelerators and Frameworks. Hope to see you then!
The Sitecore Symposium Experience For You

Jamie Stump, Parshva Vora, I, and others from the Perficient family attended Sitecore Symposium this past week. We absorbed a lot of knowledge about what is coming in Sitecore 7.5 and Sitecore 8. The message communicated by Sitecore centers on "experience": the building blocks are being put in place for you, our clients, to help your customers have a custom and personal experience, while we as a partner provide solutions that allow you to place "experience before content".

Sitecore is moving to be a top-tier provider for your marketing and communication goals through the Sitecore Experience Platform. With real-time marketing, growing demand across multiple channels, and content customized to your individual customers, you have the ability to win dedicated customers for life through an enriching experience.

Putting myself in your shoes after reading those paragraphs, I would probably think, "Well, that is a great bunch of words, but what does that really entail for me?"

Well it means a new approach to analytics data. Sitecore’s xDB, using MongoDB, provides the ability to store large amounts of data about your customers. It plays well with your current infrastructure and is extensible to your profile needs as well as scalable through various database design principles and patterns. Along with this, reporting of the analytics information is improved to use all of that collected information.


A “Connected Consumer” turns into lifelong customers

What a week it was! I am referring to the week spent at Sitecore Symposium North America and the annual MVP Summit in Las Vegas. There was plenty to absorb, with as many as seven sessions in progress at the same time. Sessions were divided into three tracks: Product, Business, and Developer. Obviously I couldn't make it to all of them, but I did attend a good mix, diverse in subject matter. From opening keynote to closing keynote, however, the emerging theme was clear: the Connected Consumer Experience!

The concept of the consumer experience is not entirely new; at the symposium, stronger emphasis was placed on the term "connected". The digital marketing landscape is continuously shifting as customers engage in business across several channels (email, websites, mobile sites, apps, social media, CRM, etc.), and this poses at least two immediate questions for any organization that takes its customers seriously:

  1. Are we ready, as an organization, to do business with customers across these diverse channels?
  2. And, if so, do we have the infrastructure and solutions in place to give us a single view of each customer across online and offline touch points, so that we can offer them a connected and meaningful experience?


Everything You Need to Know About Delve & Office Graph

OK, I've got to admit I really meant to say "Almost everything you need to know in the first release."

The more you share, the more you get. Believe in that? The Office 365 community does, and as a result, this week Microsoft hosted a "Delve YamJam" to coincide with the launch of the new Office 365 product called Delve. (If you are new to Delve, I highly recommend reading the earlier articles here and here to get to know your new friend.) Have a look at a screenshot of Delve from my demo tenant; looks pretty cool, huh?

(Screenshot: Delve in a demo tenant)

Some great questions were asked and some great thoughts shared; I'll summarize them here for the larger community. Microsoft responses came from Christophe Fiessinger, Kady Dundas, Josh Stickler, Mark Kashman, Cem Aykan and, on the phone, Ashok Kuppusamy, Stefan Debald, Fredrik Holm, John Toews, and Robin Miller.

  • Which Office 365 business plans include Delve?
    • Delve is included in the Office 365 E1 – E4 subscription plans (and the corresponding A2 – A4 and G1 – G4 plans for Academic and Government customers, respectively)
  • Can I protect data from ever being shown in others' Delve results?
    • Yes. Delve only shows documents based on the permissions set, inherited from OneDrive and SharePoint Online. Each card will also have a sharing control and a "who can see this" option
    • If your folder and its contents are not shared with anyone, they will not appear in Delve for anyone. Delve always respects the permissions set on the items.
  • Which kinds of data are considered "private data"?
    • There's both the concept of private data (e.g. files that only you, or you and a select few colleagues, can see) and private signals (e.g. the fact that you have viewed a particular document, even if it's public). Delve respects SharePoint and Search permissions, so only users who have access to read a document can see it appear as a result in Delve. Furthermore, details like the documents you view or documents others view are private.
  • Any Android / iOS apps in the pipeline for Delve?
    • Yes but no timeline could be provided yet
  • Not all content (file types) is included in Delve. Any plans for extending the list of file types and/or content sources?
    • PDF, Excel, and Word file types are included, but image files and Visio files are currently absent.
    • Yep, we are planning to add more content sources and signals to the Office Graph on an ongoing basis
    • We are working on increasing the content types supported by Delve. We started with an initial list of Office doc types, but we will expand this over time.
  • The Delve site has default branding and does not incorporate our corporate branding, which is available on the Yammer, OneDrive, and Sites menu options in the top navigation bar?
    • The top Office 365 navigation is now themeable, and your theme should be available in Delve as well. Broader theming is something we'll be looking at in the future.
  • Delve was rolled out to our business tenant yesterday. So far it is showing us trending documents that our co-workers are viewing on SharePoint. Is there a way to block certain areas so we don't see our co-workers' trends in HR searches?
    • You can make those documents not shared using the SharePoint permissions UI, but right now there's no feature to exclude documents from Delve while keeping them available to everyone. Read here for more details.
  • Will Outlook be leveraged in Delve?
    • Outlook, as part of Office 365, is already leveraged in Delve.
    • We are considering adding email attachments to Delve.
    • The Office Graph is driving scenarios for OWA, so appointments and attendee information are only leveraged in Delve if they are in OWA. You can imagine the Office Graph providing insights for multiple scenarios in the future; if you haven't already done so, check out the Office Graph blog post from Monday.
  • Does "signals from Exchange" refer to email relationships (i.e., who the recipients and senders are)?
    • Yes. To elaborate, it analyzes the set of people with whom you correspond via email and uses this data as a factor to weight your working relationships with your colleagues.
    • The org structure is another factor taken into consideration.
  • The 5 people shown on the left seem right for most users (in terms of who they interact with most), but I have seen colleagues with strange people presented as their top 5.
    • There is a known bug where groups/crawler accounts are shown instead of just people
    • In some cases the people on the left aren't related to the user in any way. These are known issues Microsoft is working on, with no ETA.
  • Will Delve work in a hybrid scenario using my on-premises systems?
    • This is a place for partner opportunities! MSFT is also working on a solution to feed on-premises content (like Exchange on-premises) into Delve, but no timeline can be announced.
    • There are plans to release hybrid connector capabilities so that the Office Graph can integrate signals and content from on-premises.
  • Any federation plans across multiple tenants?
    • No plans today
  • Delve supports the most common screen readers, high-contrast mode, etc., aligned with Microsoft policies in this area.
  • Is there a way to limit Delve deployment to certain user groups in the company, just to help the company deploy it gradually?
    • An individual user can turn off Delve. This will also control the Office Graph as well.
  • Are you adding Delve results to the search page, or can we see this as a UI opportunity?
    • Not to the SharePoint enterprise search center, but we look at that as an opportunity
  • Is there an item limit for Delve?
    • Delve shows up to 36 items in a view. The same applies when you search in the search box.
  • Any details on the API roadmap?
    • Right now you can issue graph queries through the SharePoint Search REST API using "Graph Query Language" as described here: GQL
  • If a user has permission to access a document/list item, but the library/list is excluded from search in the list settings, will the content still display in Delve?
    • Nope. Delve uses the same permissions and settings as search.
  • Do you have plans to surface Yammer conversations in any form as Delve results?
    • MSFT is actively working on showing the Yammer conversations tied to documents in Delve.
  • Is Delve going to work with the Office 365 ProPlus client or only Office Online? And does it only work with files saved in OneDrive for Business and SharePoint Online?
    • If the document is stored in OneDrive for Business or SharePoint Online, then yes, the Office Graph will index it.
  • What is the best way to introduce Delve within an organization? Are there best practices and change management recommendations?
    • We are working on an email template that Office 365 admins can then send to their users that helps address exactly what you’re asking. It would have info about What, How, Why with links and first steps. This template will be made available to admins via the message center to raise awareness.
    • We, too, plan to incorporate Delve info and insight into the adoption website we currently maintain here: Discover SharePoint  (with near-term plans to focus on broader Office 365 scenarios).
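As a hedged illustration of the GQL answer above, here is a sketch of querying the Office Graph through the Search REST API from PowerShell. The tenant URL is hypothetical, and `$credentials` is assumed to be an already-built SharePointOnlineCredentials object:

```powershell
# Hedged sketch: issue a Graph Query Language (GQL) query via the SharePoint
# Search REST API. ACTOR(ME) asks for items related to the current user;
# see the GQL documentation linked above for the full operator syntax.
$url = "https://contoso.sharepoint.com/_api/search/query" +
       "?Querytext='*'&Properties='GraphQuery:ACTOR(ME)'"

$request = [System.Net.WebRequest]::Create($url)
$request.Credentials = $credentials
$request.Headers.Add("X-FORMS_BASED_AUTH_ACCEPTED", "f")  # required for SPO credentials
$request.Accept = "application/json;odata=verbose"

$response = $request.GetResponse()
$reader = New-Object System.IO.StreamReader($response.GetResponseStream())
$reader.ReadToEnd()   # raw JSON results; parse with ConvertFrom-Json as needed
```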

Hope this provides some insight into how the Office Graph captures and renders signals. Check back for more details as I dive deeper into Delve.

Office 365 pushes Microsoft atop Gartner’s Social Magic Quadrant

Microsoft's position as a Leader in Gartner's 2014 Magic Quadrant for Social Software in the Workplace has moved to the top.

Webinar Recap: What to Know When Migrating to Microsoft Exchange

On Tuesday, we teamed up with Binary Tree, Microsoft’s 2014 Messaging Partner of the Year, for a webinar on Best Practices & Solutions For Migrating to Microsoft Exchange.

The session delved into Office 365 and common challenges when migrating to Exchange, along with the example of a customer who recently migrated to Exchange Online with the help of Binary Tree's solution, and a look at how Binary Tree's CMT Suite works, including a demonstration of CMT for Coexistence and CMT for Exchange.

First, Perficient's Rene Strawser, a lead technical consultant who focuses primarily on the deployment and migration of Microsoft unified communications technologies such as Exchange, gave attendees some background on the trends surrounding the cloud and the specific features of cloud-based Exchange in Office 365.

Following this, James Tolentino, another lead technical consultant at Perficient, shared the common challenges he has worked through when migrating customers to Exchange from legacy email platforms, and then walked through a recent customer move from a legacy email platform to Exchange Online. He described the key features of the Binary Tree tools that were used and gave an overview of the migration process, including the criticality of end-user communication and the use of PowerShell commands and replica/staging.

For the second half of the webinar, Binary Tree solution architect Perry Hiltz went into further detail on Binary Tree's award-winning SMART migration software solutions, CMT for Coexistence and CMT for Exchange, and then gave the audience a live demo of the tools in action.

You can view the entire replay here, including the demonstration. You can also catch up with the speakers on Twitter: @srstrawser and @PWHiltz.