Microsoft Enterprise Technologies
Perficient is proud to be partnered with Microsoft

Yammer Sign-in Now Tied with Office 365
http://blogs.perficient.com/microsoft/2014/11/yammer-sign-in-now-tied-with-office-365/
Fri, 21 Nov 2014

Keeping up with the rollout momentum, Microsoft released a significant feature last week that allows you to use your existing Office 365 credentials to access Yammer. It essentially provides the same experience as when using OneDrive for Business, Outlook, and SharePoint.

[Image: Yammer sign-in screen]

 

Here are some facts which will help you understand what’s covered in this release:

  • The easiest way to know when this is coming to your tenant is to check the announcement in the Office 365 Message Center (if your tenant fulfills the requirements). Rollout is expected to start in December. The global menu will change so that Yammer is included and styled like the other Office 365 services, e.g. Outlook or SharePoint.
  • This feature supports many of the scenarios previously covered by Yammer SSO, but it doesn’t cover DirSync; Yammer DirSync will still be required. Microsoft is continuing to invest in Azure AD integration and has plans that cover DirSync in the future. Note that some of the Yammer SSO scenarios are not yet covered with Office 365 login. These scenarios are covered in the documentation here.
  • When you connect to Yammer, you’ll be routed to the Microsoft login screen. Enter your credentials there and you will be redirected to your company’s ADFS server for authentication, then back to Yammer once authentication completes.
  • If the email address used to create a Yammer account is not part of your Office 365/Azure AD account, the account won’t be mapped and you will continue to sign in using the Yammer login.
  • Longer term, as Microsoft continues to deepen the Yammer, Office 365, and Azure Active Directory integration, expect Office 365 directory sync, Yammer DirSync, and O365 DirSync to be combined in the future.
  • A new option to synchronize from Azure AD (rather than on-premises AD) is on the backlog.

 

Hopefully you are as excited as I am about this announcement. Happy Yammering!

 

Troubleshooting Search in SharePoint Online (O365)
http://blogs.perficient.com/microsoft/2014/11/troubleshooting-search-in-sharepoint-online-o365/
Fri, 21 Nov 2014

SharePoint makes great use of its Search engine, and Search is pervasive throughout most SharePoint solutions today. Whether you are building Content Search Web Parts leveraging Display Templates or creating a custom Search Center, you will be making use of SharePoint’s powerful and mature Search engine. However, what happens when things behave unexpectedly? How can you troubleshoot Search? This article focuses mainly on SharePoint Online (O365) but can also be applied to Search on-premises.

Search is not magic!

When supporting others in their troubleshooting of Search issues, I usually start by explaining that Search is not a black art. I agree there are nuances but, for the most part, figuring out issues can follow a simple scientific process. After all, SharePoint Search is just a collection of properties stored for query and retrieval by the Search engine.

The perception that Search is a black art often extends to users and their expectations. When search does not yield the results a user expects we should always investigate what they expect and why they expect it. Work directly with end users and try to help them understand why they are not getting what they expect.

Approach to Troubleshooting

Generally speaking, it is advisable to make small but smart moves when changing configuration. It can take days or weeks to test the results of your configuration changes if long re-crawls are required. Above all, be scientific: use the same test cases and track search results before and after each change.

The two sides of Search

There are two sides to Search, Crawl and Query. Potentially either side could be failing or not working as expected so it is important to investigate both.

SharePoint Online in O365 does not currently offer much control or insight into the Crawl side (this may change in the future). SharePoint 2013 on-premises offers many opportunities to troubleshoot the crawl. On-premises we can:

  • Inspect the crawl log
  • Turn on verbose logging during crawls
  • Attach crawls to Fiddler

The Query side can be inspected almost as fully online as it can on-premises. It’s important to understand that search queries are transformed as they are passed to the search engine. The user types a query into a search box, but there are a number of places where the query can be transformed, or the results influenced, by search configuration:

  • Search Results WebPart e.g. in a Search Center
  • Content by Search WebPart
  • Results Source
  • Query Rules
  • Synonyms

Inspect the Query using the Query Builder

There are at least two great ways to inspect the Query side of Search.

First, use the OOTB Query Builder. From a Content by Search or Search Results Web Part you can open the Query Builder. Switch to Advanced Mode (you are most definitely Advanced now!) and then head over to the TEST tab. Once there, click ‘Show more’ to see the full transformation of the query in the bottom text box (highlighted).

[Image: Query Builder TEST tab]

Seeing the full transformation of the query is important because there may well be transformations impacting results that you had not thought of. In addition, if any part of the query uses dynamic property values based upon the page or user (e.g. {User.ProfileProperty}), these values will be expanded and displayed for inspection.

Inspect the Query using the 2013 Search Tool

The SharePoint 2013 Search Query Tool is now an essential part of my day-to-day work with SharePoint Search. It uses the REST API to retrieve results from SharePoint Search and can be used with SharePoint Online (O365) and On Premises SharePoint 2013.

http://sp2013searchtool.codeplex.com

The tool provides a fast, convenient and repeatable way to inspect Search results. A few tips for using this tool most effectively:

  • Be scientific! Use fixed queries, analyze results before and after configuration changes.
  • Turn off Trim Duplicates; it is ON by default. Trimming duplicates can be really confusing when analyzing results, especially when you have repetitive test data in a system.
  • Ensure you are using the correct Result Source. By default this will be set to Local SharePoint Results. However, if custom Result Sources are in play, be sure to set them in the tool. You will need the GUID identifying the Result Source; this can be obtained easily by browsing to the Result Source in SharePoint and grabbing the ID from the query string.
  • Inspect the Rank values returned. The rank values determine which results appear first. If any part of the transformation uses XRANK to boost results, this should be evident in the Rank value.
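Because the Search Query Tool simply calls the search REST API, you can reproduce its requests yourself for scripted, repeatable testing. The sketch below is a minimal example, assuming an on-premises site reachable with default credentials; the site URL, query text, and Result Source GUID are placeholders you would swap for your own:

```powershell
# Query the SharePoint 2013 search REST API directly (what the Search Query Tool
# does under the covers). Placeholders: site URL, query, Result Source GUID.
$site     = "http://sp.contoso.com"
$query    = "Title:Hines"                            # fixed, repeatable test query
$sourceId = "8413cd39-2156-4e00-b54d-11efd9abdb89"   # e.g. Local SharePoint Results

$url = "$site/_api/search/query?querytext='$query'" +
       "&trimduplicates=false" +                     # avoid duplicate-trimming confusion
       "&sourceid='$sourceId'" +                     # explicit Result Source
       "&selectproperties='Title,Path,Rank'"         # include Rank for inspection

Invoke-RestMethod -Uri $url -UseDefaultCredentials `
    -Headers @{ Accept = "application/json;odata=verbose" }
```

For SharePoint Online you would replace `-UseDefaultCredentials` with an authenticated session (e.g. SharePoint Online credentials), since basic Windows auth does not apply there.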

Check the basics

Are you writing valid Keyword Query Language (KQL)? SharePoint Search only understands KQL, not syntax from other search engines such as Google, Solr, Westlaw, Lexis, etc.

http://msdn.microsoft.com/en-us/library/office/ee558911.aspx
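For reference, a few illustrative KQL queries (the property names are standard managed properties; the values are made up):

```
Title:Hines                                   (restrict on a single managed property)
"contract renewal" AND filetype:docx          (exact phrase plus a property restriction)
Author:"John Smith" AND LastModifiedTime>=2014-01-01
```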

If the user is expecting a specific document to be returned, has that document actually been indexed? This can be verified by performing a Path search, e.g.:

Path:http://contoso.sharepoint.com/documents/document.docx

On-premises, you could also check the crawl log to see if the item hit an error during indexing.

If the search query is expected to hit document body content, you should also check that the source text is readable by the search engine. In the case of scanned PDF documents, they will need to be OCR’d.

Managed Properties

It’s likely that Search is not behaving the way users expect due to Managed Property configuration. Managed Properties are really the backbone of the search engine and our greatest opportunity for customization. They are also our greatest opportunity to make configuration mistakes, so we need to check the Managed Property settings. This article does not fully explore Managed Property configuration, but here are the basic settings we should consider when troubleshooting:

  • Searchable – Enables us to query against the content of the property, the content is stored in the full-text index. On premises we can also set the ‘Context’ of the property which will allow different weighting to be applied.
  • Queryable – Enables us to query directly against the property e.g. Title:Hines . Note that this is different to simply being included in the full text index.
  • Retrievable – Enables us to retrieve the value of the property in search results. If a property is not retrievable, it can be confusing when trying to inspect it using the search tool: the value may be queryable and used in queries but, because it is not retrievable, we cannot inspect it.
  • Refinable – Means the property can be used as a search refiner. Note that this is different to querying properties.
  • Sortable – Means that results could be sorted by this property.
  • Mappings – Managed Properties will be mapped to one or more Crawled Properties. This relationship is worth reading more about, and the necessity to re-crawl after changing these values is important to consider. When more than one Crawled Property is mapped, be sure that the configuration results in the values you expect.

It’s really important during any Search project to have consistent and well-understood Managed Property configuration. At Perficient we often build a Managed Properties Specification, which allows us to collaborate on how properties will be configured. In addition, it is highly recommended to script Managed Property configuration so that Development, QA, and Production environments are identical. It’s very easy to let this slip and have misconfigurations confuse you when testing across multiple environments. More on scripted deployment to O365:

http://blogs.perficient.com/microsoft/2014/09/powershell-deployment-to-sharepoint-online

OOTB Title Managed Property

The ‘Title’ Managed Property is probably the single greatest point of confusion I see in SharePoint Search projects. Looking at the Mapping configuration it is already apparent that it’s complicated.

[Image: Title Managed Property mappings]

Unfortunately the MetadataExtractorTitle has, in my opinion, only added to this confusion. If your users are relaxed about seeing an interpreted Title in search results, you will most likely not get feedback. However, it appears most users get confused by this and require us to troubleshoot exactly why their Title is not displaying as expected in search results. You can, of course, look at the Crawled Property ordering and try to determine exactly why the Title Managed Property is populated as it is. However, if this turns out to be a major problem for users, I would look into creating your own separate Managed Property for handling Title.

Low level Debugging

Detailed debugging during a crawl or query is reserved for SharePoint on-premises. During a crawl, we can turn on VerboseEx logging and analyze the detail of what’s happening during acquisition. This can point to a number of problems, including communication with the source during a crawl or an issue with security trimming when querying.

If you need to pursue a Search support incident with Microsoft, it is likely Microsoft will ask for a VerboseEx log. The only caution is that VerboseEx grows the logs very quickly, so ensure it is only enabled for a short period of time.

Good luck!

SharePoint Search does have its challenges but it is also extremely powerful and offers great patterns for surfacing content. We see it making its way into new areas and services in Office 365 all the time. Embrace it! I think it’s here to stay.

Video Comes to Office 365 and SharePoint Online
http://blogs.perficient.com/microsoft/2014/11/video-comes-to-office-365-and-sharepoint-online/
Thu, 20 Nov 2014

This week, Microsoft announced the release of an Office 365 Video Portal (see Office 365 Video).

This is an exciting first step into an area of great demand and large potential. In the past, many larger enterprises purchased dedicated third-party solutions for the management of video content. Smaller organizations typically leverage YouTube. SharePoint deployments at small and mid-size companies sometimes use native Video content types and/or custom solutions.

The Office 365 Video portal goes above and beyond the SharePoint video content type in many important ways:

  • The Video Portal is somewhat akin to the “old school” SharePoint Document Center. The Video Portal is a dedicated “home” Site Collection with support for securable “channels”. Each channel can have completely independent permissions and administrators, and is mapped to a unique site collection.
  • Uploaded videos are converted into a variety of formats/qualities for playback. At playback time, the video portal dynamically adjusts (every 2 seconds) to use the most appropriate format for the device/bandwidth conditions.
  • The original upload is managed in a SharePoint site collection (and incurs storage cost); the converted files are managed via Azure Media Services and do NOT result in additional storage charges.
  • Videos will appear in Enterprise Search results, and views are tracked in analytics reports.

It’s important to note that this is a v1 product with the following areas for future improvement:

  • Playback is via Flash, not HTML5. This represents a large roadblock for mobile device use. Microsoft acknowledges this, and HTML5 support is clearly on the roadmap.
  • Uploaded videos are limited to 2 GB, with larger limits (as in the consumer version of OneDrive) undoubtedly in the works.
  • Videos live in their “host” site collection and are not  integrated across all tenant site collections via apps; simpler link sharing/embedding is possible
  • Extranet and/or Intranet use cases are not currently supported

 

Even with the v1 caveats, this will be a great value add for many Office 365 customers. The feature can be enabled/disabled at the tenant level, so organizations will have time to evaluate the ROI and launch when appropriate.

 

 

Office 365 – How to Handle “Large Messages” During Migration
http://blogs.perficient.com/microsoft/2014/11/office-365-how-to-handle-large-messages-during-migration/
Tue, 18 Nov 2014

Exchange Online provides a fair amount of flexibility; that said, there are a few aspects of the service that cannot be changed. These service restrictions are documented in the Exchange Online Limits for each of the subscription types.

One limitation that you may encounter when migrating to Exchange Online is the “Message Size Limit” of 25 MB. Depending on your current on-premises limit, mailboxes may contain messages that exceed 25 MB; these “large messages” will cause issues during your migration. Your first encounter with large messages may be the error “This mailbox exceeded the maximum number of large items that were specified for this request” during a mailbox move.

There are a number of methods to address large messages, each causing a varied level of impact to your users. After determining that there are large messages in your environment, the first question to answer is: what do you want to do with them?

The messages can’t be migrated to Exchange Online, so there are a few options:

  • Export the large messages to a format such as .MSG, .EML or PST and archive them
  • Delete/archive just the large attachment from the message and migrate the remaining message body
  • Skip them during the mailbox migration, essentially deleting the messages and their attachments

Since we’re talking about moving, modifying or deleting messages, you will want to take into consideration any email retention requirements in your organization and how your decision may impact mailboxes involved in eDiscovery.

Identify

Before we can do anything with large messages, we need to be able to identify them. This is where you’ll first need to decide how much burden you place on your users to help out with this task.

The most burdensome approach would be to ask your users to “just go search their mailbox” for items over 25 MB. While this might work for some organizations, we can do better.

[Image: Search Folder]

Next up would be to create a “view” in the user’s mailbox of all their large items using a feature called “Search Folders”. You could provide your users instructions on creating a Search Folder for messages over 25 MB, or you can go one step further and create the Search Folder for them using a combination of Exchange Web Services (EWS) and PowerShell. This leaves the user with an easy view of all the large messages that need action taken.

Aside from creating a Search Folder in the mailbox, you can report on these large messages using PowerShell, EWS, or a combination of the two. Keep in mind that generating a report across the organization can be quite time-consuming, given you are generating a query for every mailbox.
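As an illustration of the EWS approach, the sketch below uses the EWS Managed API from PowerShell to find items over 25 MB in a single mailbox’s Inbox. The DLL path, mailbox address, and page size are assumptions; error handling and paging through more than 100 results are omitted:

```powershell
# Load the EWS Managed API (install path varies).
Add-Type -Path "C:\Program Files\Microsoft\Exchange\Web Services\2.2\Microsoft.Exchange.WebServices.dll"

# Connect as the current user via Autodiscover.
$ews = New-Object Microsoft.Exchange.WebServices.Data.ExchangeService([Microsoft.Exchange.WebServices.Data.ExchangeVersion]::Exchange2010_SP2)
$ews.UseDefaultCredentials = $true
$ews.AutodiscoverUrl("testuser@contoso.com")

# Filter: item size greater than 25 MB (26214400 bytes).
$filter = New-Object -TypeName "Microsoft.Exchange.WebServices.Data.SearchFilter+IsGreaterThan" `
    -ArgumentList @([Microsoft.Exchange.WebServices.Data.ItemSchema]::Size, 26214400)

# Search the Inbox and report the matches.
$view  = New-Object Microsoft.Exchange.WebServices.Data.ItemView(100)
$found = $ews.FindItems([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox, $filter, $view)
$found | Select-Object Subject, Size, DateTimeReceived
```

Note this only searches the Inbox; a full report would walk the folder tree (or use an AllItems search folder) and loop over every mailbox with impersonation rights.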

Take Action

Once we’ve identified large messages, you’ll need to decide what to do with them. Some organizations take the “high user burden approach”: once they’ve shown their users the large items, they leave the action up to the users themselves. This could include saving the message locally as an .MSG, .EML, or .PDF file, exporting to a PST, or something as low-tech as printing the item. You’ll probably find that user compliance is not incredibly high with this approach, and you’ll still have a number of large messages remaining in your mailboxes.

Export

Assuming you’re not going to leave the work to your end users, you have some options for retaining the large messages outside of the user’s mailbox. Certainly no one wants to generate more PSTs in their environment but if the large messages need to be retained, this is a viable option.

Depending on your version of Exchange, you can use PowerShell to export the large messages to PST. You could use either “New-MailboxExportRequest” (2010) or “Search-Mailbox” (2013) cmdlets to accomplish this task.

Below are some sample commands you can start with:

New-MailboxExportRequest -Mailbox testuser -ContentFilter {Size -gt 26214400} -FilePath "\\SERVER\D$\Exports\testuser.pst"

Search-Mailbox testuser -SearchQuery "Size>26214400" -TargetMailbox "LargeMessageMailbox" -TargetFolder "Inbox"
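To run the export across many mailboxes rather than one at a time, a simple wrapper loop might look like the following (an Exchange 2010 sketch; the mailbox scope and export share are assumptions, and the share must be writable by the Exchange Trusted Subsystem group):

```powershell
# Export items over 25 MB from every mailbox to a per-user PST.
Get-Mailbox -ResultSize Unlimited | ForEach-Object {
    New-MailboxExportRequest -Mailbox $_.Alias `
        -ContentFilter { Size -gt 26214400 } `
        -FilePath "\\SERVER\Exports\$($_.Alias).pst"
}

# Check on the queued exports later:
Get-MailboxExportRequest | Get-MailboxExportRequestStatistics
```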

If you’re looking for a more complete solution that is already put together, there is a pretty thorough script in the TechNet Script Gallery that does much of the work for you: [PowerShell Script] Office 365 Large Item Compliance v2.1

As an alternative, MessageOps has a “Large Message Exporter Tool” that is no longer supported but may be an option.

Modify

The middle ground between moving the entire message and deleting it is to remove only the attachment. This might be best left to the end user: save the attachment off to disk and then delete it from the original email. Be prepared to write some thorough documentation and plan for a lot of hand-holding here, as there are a number of steps involved.

It may be possible to automate this process via EWS but the above alternatives are probably better when compared to the effort involved here.

Delete

Perhaps you’ve decided to delete any remaining large messages because users have saved what they needed, you exported the data for them, or you’re just fine with those messages going away. Assuming the user communication has been taken care of, you can proactively delete the large messages via PowerShell or simply skip them at the time of migration.

To skip the messages, use the “-LargeItemLimit” and “-AcceptLargeDataLoss” parameters during your migration to skip the large messages.
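In a hybrid onboarding move, those parameters sit on the move request itself. A sketch follows; the identity, endpoint, and delivery domain are placeholders, and note that once the limit exceeds 50, `-AcceptLargeDataLoss` is required:

```powershell
# Hybrid onboarding move that skips up to 100 large items.
$cred = Get-Credential   # on-premises migration administrator

New-MoveRequest -Identity "testuser@contoso.com" `
    -Remote `
    -RemoteHostName "mail.contoso.com" `
    -TargetDeliveryDomain "contoso.mail.onmicrosoft.com" `
    -RemoteCredential $cred `
    -LargeItemLimit 100 `
    -AcceptLargeDataLoss
```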

Summary

  • Large Messages may need to be addressed depending on your current message size limits.
  • As an organization, you will need to decide where these Large Messages go given that they cannot be migrated.
  • Keep in mind any legal eDiscovery or email retention requirements your organization may have.
  • The approach you choose will create a varied level of burden upon your end users.

Additional Notes

While the above article refers to the message size limit being set to 25 MB, the value is in fact technically 35 MB. Microsoft considers the “stated limit” as 25 MB and then provides 10 MB of “growth allowance” to provide for the overhead during processing of the message.

You’ll also find a number of references to the “-LargeItemLimit” parameter where it is incorrectly believed to be the maximum size of the large items you’re allowing; this parameter is actually the number of large items you’re willing to skip, not the size of the items.
 
Did you find this article helpful?

Leave a comment below or follow me on Twitter (@JoePalarchio) for additional posts and information on Office 365.


Hybrid Analytics in Healthcare with O365 & Power BI Webinar Recap
http://blogs.perficient.com/microsoft/2014/11/hybrid-analytics-in-healthcare-with-o365-power-bi-webinar-recap/
Tue, 18 Nov 2014

Last week, we held our last Microsoft business intelligence-focused webinar of the year, “Hybrid Analytics in Healthcare: Leveraging Power BI and Office 365 to Make Smarter Business Decisions.” Heidi Rozmiarek, Assistant Director of IT Development for UnityPoint Health, spoke, along with our Microsoft BI team, on implementing an analytics platform in a hybrid environment.

First, the Perficient team covered architectural components and functions; architecture options including on-premises, hybrid, and cloud; and delivery considerations. Next, Steven Gregor, a technical consultant on our Microsoft BI team, reviewed Power BI and its features, including the security model and the client-side Data Management Gateway, and then walked through a live demo.

Last, Heidi shared how her organization is architecting a successful analytics infrastructure using Microsoft technologies. She explained how UnityPoint Health is leveraging Microsoft’s BI stack to provide simple solutions to complex questions. Heidi described how they built the solution: collecting and cleansing the data, modeling it, and visualizing and reporting the answers. She wrapped up by sharing her organization’s plans to move further toward a hybrid on-premises/cloud solution in the next few months.

Heidi’s key takeaways are worth noting:

Power BI transforms Excel

  • Power Query allows for data acquisition & cleansing
  • Power Pivot enables data modeling & cube development
  • Power View & Power Map fuel advanced visualization

Power BI & SSRS transform SharePoint into an analytics hub

  • Sharing of Power Pivot data models
  • Sharing of Power View dashboards
  • SSRS running in SharePoint integrated mode allows reports to sit alongside dashboards

End user adoption is rapid

  • Familiarity with Excel & SharePoint allows quick ramp-up
  • Analyst layer of the organization will flourish
  • Build an environment that can “grow with you”

Watch the replay here.

During the session, I mentioned a case study, published by Microsoft, on Heidi’s work at Meriter Health Services. It describes how Meriter deployed a self-service BI solution leveraging the Microsoft stack to extend its EHR solution. This solution enables the hospital to provide more effective patient care and save more than $1 million annually.

All About ‘Clutter’ – Home Run for Microsoft Office Graph
http://blogs.perficient.com/microsoft/2014/11/clutter-another-home-run-for-microsoft-office-graph/
Mon, 17 Nov 2014

We all receive email that we may have signed up for (such as blog posts or newsletters) and that isn’t exactly junk, but is less likely to get our attention. Clutter uses the intelligence of Office Graph to see how important (or unimportant) this email is to you. It learns your levels of importance over time, then uses that analysis to separate the clutter from other inbox items. You can quickly scan the clutter, mark individual items as “not clutter,” and take action on the rest, such as deleting it all. And if you don’t like the feature, you can turn it off.

A statement from the Office team: “Clutter learns from your actions to determine the messages you are likely to ignore.” The underlying idea is quite simple: Microsoft wants to leverage the knowledge it gains about your daily work activities to automatically filter out emails that don’t immediately need your attention.

The flow works something like this: first, emails considered SPAM are rejected before they are delivered to your mailbox. Then, if a message lands in the inbox, it gets routed through your rules, and eventually Clutter does its magic. Clutter gets smarter over time by learning from your prior actions with similar messages and by examining the type of content and even how you are addressed in the message. This means the Clutter experience is personalized to each individual, based on their actions and preferences.

Here is a two-step process to turn ON Clutter for your Office 365 inbox. Remember, it’s an individual user setting, and there is currently no way to automate the process for multiple users.

[Image: Enabling Clutter – step 1]

[Image: Enabling Clutter – step 2]

After turning ON Clutter, you should receive a welcome email similar to this:

[Image: Clutter welcome email]

We’ve seen how to get this feature for your inbox; now let’s dig a bit deeper into various facts that may help you.

SOME INTERESTING & FUN  FACTS

  • Available to all SKUs

  • Clutter works in all clients (in terms of moving messages to the clutter folder and learning from your behavior), though it currently requires OWA to turn it on.

  • Clutter is only a feature on individual mailboxes.

  • Not available for Office 365 groups mailbox.

  • To get the Clutter feature earlier, enable your Office 365 tenant to receive First Release.

  • It is an Exchange feature, not a client feature, so no special client is required on mobile or desktop. Users will see Clutter emails in their Clutter folder; the items will not show up in their Inbox. Clutter is available in any mail client that can display and sync folders from your mailbox, i.e. smartphones, tablets, etc.

  • If a user is not reading the Yammer notification emails, Clutter will see that they are not important to that individual and will move them out of the inbox into the Clutter folder. To reverse this, drag a Yammer notification email back into the Inbox folder; Clutter will then learn that messages from Yammer are important and will leave them in your Inbox in future.

  • Clutter is a user preference feature, similar to electing to use conversation view. There are no current plans to add Clutter-specific admin controls; however, Microsoft is looking at how to add broader tenant rollout controls.

  • Clutter is trained approximately once per day. Moving items out of Clutter is learned during the next training cycle.

  • You can train Clutter from your phone.

  • Clutter looks at the folders at the Exchange level, independent of client.

  • Clutter is only available in the online version; it is not planned for the next version of Exchange Server. Clutter requires rapid feedback to fine-tune the machine learning, which isn’t possible in an on-prem deployment, and it also requires additional server processing resources that would impact sizing for on-prem servers.

  • Q: What is the processing order between Junk Mail, Clutter and custom rules?
    A. Junk Mail first, then custom rules, then Clutter. (AFAIK)

    Q: Can we have a rule which force something to go to clutter / not go to clutter?
    A. Yes, but if you force a message into Clutter it won’t go through the Inbox and therefore won’t be considered as a signal for learning your preferences

    Q: Can we have a rule which operates on e-mail which does arrive in Inbox/Clutter only?
    A. All rules operate on inbound mail destined for the Inbox unless redirected by a rule. Clutter currently applies after all server-side rules. Clutter is not aware of client-side rules. There is a trick that allows you to force an item to not go through Clutter processing. You can create a rule which moves the item to a folder or your Inbox and Clutter won’t be processed on messages which that rule applies to. You can’t control the order in which rules and Clutter run. Today Clutter will always run after your last rule.

    Q: Can we create a rule and specify when it should be processed (before/after Junk Mail, before/after Clutter)?
    A. No

  • Q: I enabled Clutter yesterday afternoon and now most of my mail goes to Clutter?
    A: This can happen if you get a lot of circulars and non-personal email that could be regarded as clutter, or if most email in your inbox was left marked unread before enabling Clutter.

  • Q: Does Clutter learn from the Content / subject of an email to classify?
    A: Yes. Move the Internal communication email into Clutter and leave the Sales email in your Inbox and all will be well.

  • Rules vs. Clutter

    If you want to apply that level of precision, use rules. Or turn them off and let Clutter do its stuff… but accept that all of those messages will go into a single folder.
    Clutter only learns from your choices, as everyone prioritizes their inbox differently.

  • Difference between JUNK and CLUTTER

    As an example, Clutter can learn which internal distribution lists you read and which ones you don’t, and will move the ones you don’t read to Clutter. Junk Email doesn’t do that. You can think of junk email as something that is not personalized – junk for one person is junk for others. Clutter, on the other hand, is highly personalized – it’s the set of messages you tend to ignore, but others may not. Junk is something that you really do not want to see in the future; Clutter is email that you will probably want to look at, but not as a high-priority activity.

  • If you are one of those users who select multiple messages and mark them as read, Clutter will not learn any signal from that. You need to either leave messages unread, move them to the Clutter folder (via drag and drop, or right-click and Move to Clutter), or delete them while still unread to generate a Clutter signal. Microsoft is looking at improvements to the model so it can learn from users who mark everything as read.

  • If you are using Clutter and nothing has populated the folder, try moving some messages into the Clutter folder; it should then start picking up signals.

  • Clutter only works on email that is going to your Inbox. If a rule moves the email first, Clutter won’t move it or train on it later.

 

Source: Office 365 Community Network; Microsoft.com; Office Blogs

]]>
Anglebrackets Conference – Day 4 http://blogs.perficient.com/microsoft/2014/11/anglebrackets-conference-day-4/ http://blogs.perficient.com/microsoft/2014/11/anglebrackets-conference-day-4/#comments Sat, 15 Nov 2014 17:00:08 +0000 http://blogs.perficient.com/microsoft/?p=24293 Keynote – ASP.NET vNext and you
Speaker: Scott Hanselman

ASP.NET will run anywhere (Mac and Linux). A web server will be included.

New free SKU of Visual Studio: the Community edition will replace Express.

ASP.NET and the modern web

  • Totally modular (dependency injection built in)
  • Faster development cycle
  • Seamless transition from on-premises to cloud
  • Choose your editors and tools
  • Open source with contribution
  • Cross-platform
  • FAST

The framework (CoreCLR) is packaged together with the application rather than shared, so it is safe to deploy.

Split between the .NET Framework (Full CLR) and .NET Core (Core CLR). The Core CLR is open-sourced and cross-platform.

The project file (project.json) replaces both web.config and the NuGet packages file.
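As a rough sketch of what that looks like (the exact schema changed across the vNext previews, so the package names and versions below are illustrative assumptions, not a definitive reference):

```json
{
  "dependencies": {
    "Microsoft.AspNet.Mvc": "6.0.0-beta1"
  },
  "frameworks": {
    "aspnet50": {},
    "aspnetcore50": {}
  },
  "commands": {
    "web": "Microsoft.AspNet.Hosting --server Microsoft.AspNet.Server.WebListener"
  }
}
```

The dependencies section doubles as the NuGet package list – there is no separate packages.config.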

It’s possible to reference ASP.NET in source form and debug it.

VS supports Bower (client-side package manager) and Grunt (client side build tool).

Demo: packaging an ASP.NET application together with the framework, moving it to a different computer, and running it from there.

Demo: running an ASP.NET application on a Mac. OmniSharp.net – a cross-platform .NET IDE.

Resource: microsoft.github.io – portal to Microsoft open source


Session: Adding offline support to mobile apps using Azure Mobile Services
Speaker: Robert Green, technical evangelist at Microsoft

Review of Azure Mobile Services

Add offline support

  • SQLiteStore
  • Synchronization context
  • Pull, push, purge
  • Conflict resolution

Sync table

  • local table that tracks local changes
  • Only works with entities with string ids

GetSyncTable

  • Requires SQLite (nuget package)
  • Requires SQLiteStore package
  • Use GetSyncTable instead of GetTable; this enables offline support.
  • Need to add a column with version number to the entity

Demo: using sync table from windows store app and windows phone app

  • Table.PullData() loads data from the server (Azure) into the local table and also pushes offline changes to the server – a full synchronization, but just for that table.
  • SyncContext.Push() uploads all local changes to the server.
  • Table.Purge() cleans out the local table. It does not record a deletion operation.

Conflict detection

  • Create sync handler to handle conflicts. Demo.
  • Default sync handler fails if there are conflicts

Session: Modern app diagnostics with Glimpse and ASP.NET
Speaker: Anthony van der Hoorn

A web application diagnostics tool.

Glimpse can be downloaded as a NuGet package.

Diagnostics is one of the hardest things to do.

40–60% of development time is spent debugging.

  1. Current approach
    1. Debugging and diagnostics haven’t changed in the last 20 years
    2. Debugging tools are too low-level
    3. The runtime is dynamic; it can be hard to match a request with its execution flow
  2. Context is important
  3. Important questions
    1. How do we educate people about what’s going on in an application?
    2. How do we know if the framework is running as expected?
  4. Glimpse
    1. Trace.axd for the 21st century. Shows at the bottom of the page
    2. Aggregates request data
    3. Bridges client and server
    4. Framework-level insights
    5. Free, open source
    6. Built as an HttpModule
  5. Glimpse demo

Glimpse can display:

  • Server configuration settings
  • Server environment
  • Request, as received by the server
  • Session
  • HTTP variables
  • Trace
  • Execution timeline with performance data
  • Metadata
  • Route data
  • Views, MVC engine decision tree
  • SQL execution with performance data
  • Overall timeline, what’s happening when
  • Inspect Ajax requests
  • Dashboard display

Resource: http://getglimpse.com


Session: Building Persistent HTML5 applications

Speaker: Craig Shoemaker

Resources: http://bit.ly/ws-idx

Client-side persistence options

  1. Cookies
    • Old, good, supported by every browser
    • Work for smaller pieces of data, ids
  2. Web storage
    • Fine for smaller pieces of data
  3. Offline caching
    • Doesn’t store data; stores pages and resources
  4. IndexedDB
    • Document database, NoSQL
    • Can persist JSON
    • Key/value, where the value is an object
  5. Web SQL – a dead end, no longer supported as a standard
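Option 1 above is ultimately just string manipulation on document.cookie. A minimal sketch of serializing and parsing that format (the helper names are my own, not a standard API):

```javascript
// Serialize a name/value pair into a cookie string with an expiry,
// suitable for assignment to document.cookie in a browser.
function serializeCookie(name, value, days) {
  var expires = new Date(Date.now() + days * 864e5).toUTCString();
  return encodeURIComponent(name) + '=' + encodeURIComponent(value) +
         '; expires=' + expires + '; path=/';
}

// Parse an "a=1; b=2" cookie header string into a plain object.
function parseCookies(cookieString) {
  var jar = {};
  cookieString.split('; ').forEach(function (pair) {
    if (!pair) return;
    var eq = pair.indexOf('=');
    jar[decodeURIComponent(pair.slice(0, eq))] =
      decodeURIComponent(pair.slice(eq + 1));
  });
  return jar;
}
```

In a browser you would assign the serialized string to document.cookie, and parse document.cookie back out on the next visit.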

IndexedDB

Event lifecycle:

  • Open request
  • Pass a database version
  • onupgradeneeded fires if a newer version was requested; do the DB upgrade there
  • Success event
  • May raise an error
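The lifecycle above can be summarized as a tiny decision table. This is only an illustration of the spec’s behavior (the upgrade event fires when a newer version is requested; a downgrade is rejected with an error), not a polyfill:

```javascript
// Given the version stored on disk and the version passed to
// indexedDB.open(), return the events a conforming implementation
// fires on the open request, in order.
function openEvents(currentVersion, requestedVersion) {
  if (requestedVersion < currentVersion) {
    return ['error'];                      // downgrades are rejected (VersionError)
  }
  if (requestedVersion > currentVersion) {
    return ['upgradeneeded', 'success'];   // schema upgrade runs, then the DB opens
  }
  return ['success'];                      // same version: it just opens
}
```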

Need to “normalize” the API, i.e. translate vendor-specific API names.
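That normalization is typically just picking whichever (possibly vendor-prefixed) name the browser exposes. Sketched here against a window-like argument so it can run anywhere:

```javascript
// Return the first available IndexedDB implementation from a
// window-like object, or null if none is present.
function normalizeIndexedDB(win) {
  return win.indexedDB || win.webkitIndexedDB ||
         win.mozIndexedDB || win.msIndexedDB || null;
}

// In a browser: var idb = normalizeIndexedDB(window);
```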

Close DB before deleting.

Have to have a transaction

Chrome dev tools allow you to inspect IndexedDB

Can iterate with a cursor

Can create an index and search by index

Stable standard, no difference between browsers

Can create keys; keys can auto-increment or be GUIDs

See http://bit.ly/ws-idx for an example of an IndexedDB abstraction

Web Storage

Aka DOM Storage: sessionStorage and localStorage

Also key/value pair

Storage limit is 5 MB (effectively 2.5 MB, because strings are stored as two-byte Unicode).

Sandboxed per origin

Data remains on the client

No databases, no versioning

sessionStorage vs localStorage – sessionStorage is not persistent across browser sessions

“storage” event fires on changes; you can subscribe to it
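The Web Storage surface is tiny – setItem/getItem/removeItem over string keys and values. A sketch against an in-memory stand-in so it can run outside a browser; in a page you would use window.localStorage or window.sessionStorage directly:

```javascript
// Minimal in-memory stand-in for the Web Storage interface.
function createMemoryStorage() {
  var data = {};
  return {
    setItem: function (key, value) { data[key] = String(value); }, // values are always strings
    getItem: function (key) { return key in data ? data[key] : null; },
    removeItem: function (key) { delete data[key]; }
  };
}

var storage = createMemoryStorage();
// Objects must be serialized to strings, e.g. with JSON:
storage.setItem('user', JSON.stringify({ name: 'pat' }));
var user = JSON.parse(storage.getItem('user'));
```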


Session: Building Custom Monitoring, fast (using Glimpse)
Speaker: Anthony van der Hoorn, Glimpse

Demo: instrumenting MVC app for Glimpse.

Create a class extending AspNetTab.

The application is not modified by the added instrumentation.

No need to format data; Glimpse takes care of the data representation.

If a tab needs to access session state, override ExecuteOn in the AspNetTab so the tab executes at end of session.

Possible to specify custom layout for tab.

defaultRuntimePolicy=”Off” – Glimpse will not do anything
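For reference, that policy lives in the glimpse section of web.config, which the NuGet package registers; the fragment below is from memory, so verify the attribute names against your installed version:

```xml
<configuration>
  <configSections>
    <section name="glimpse" type="Glimpse.Core.Configuration.Section, Glimpse.Core" />
  </configSections>
  <!-- "Off" disables Glimpse entirely; "On" enables it for this application -->
  <glimpse defaultRuntimePolicy="Off" endpointBaseUri="~/Glimpse.axd" />
</configuration>
```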

Implement IRuntimePolicy to enable Glimpse only in specific cases (e.g. for specific users or user groups)

RuntimePolicy.PersistResults – record only, do not display

Implement IResource to create custom data provider

]]>
http://blogs.perficient.com/microsoft/2014/11/anglebrackets-conference-day-4/feed/ 0
Office 365 and Salesforce: Integration Case Study Part II http://blogs.perficient.com/microsoft/2014/11/office-365-and-salesforce-integration-case-study-part-ii/ http://blogs.perficient.com/microsoft/2014/11/office-365-and-salesforce-integration-case-study-part-ii/#comments Fri, 14 Nov 2014 21:24:55 +0000 http://blogs.perficient.com/microsoft/?p=24303


Given the central role that Office 365 occupies for more and more businesses, integrating the resources managed by Office 365 with other services is a challenge that Perficient often addresses for clients. The good news is that the Office 365 platform and the architecture of many other key platforms provide countless integration possibilities, many of which can be leveraged without the need for custom coding.

Recently, I was asked to tackle an integration of Office 365 and Salesforce that serves as a good illustration of the possibilities. I described the problem and solution in an earlier post, where I showed how to expose Office 365 information within Salesforce. In this post, I would like to look at the problem from the opposite direction – how to expose Salesforce information within Office 365.

The Problem

In this case, the goal is to expose core Salesforce information within an Office 365 site. Ideally, the information would behave as if it were “native” to the site – web parts, lists, columns, etc. could all behave as expected. This integration could, of course, be accomplished by creating an appropriate SharePoint application, but as in the previous case, we are looking for a “no code” solution.

The Solution

Fortunately, Office 365 supports a technology designed to solve exactly such external data integration scenarios: Business Connectivity Services (BCS). This technology was originally developed as a component/service for on-premises SharePoint but is also supported within Office 365.

BCS supports the notion of an external content type, which can be used to describe Salesforce entities in a manner that allows SharePoint to present the data as if it were internal. The information contained in an external content type for Salesforce includes the following:

  • Connection / Authentication information for the source Salesforce Organization.
  • A definition of the Entities, Fields, and data types (aka metadata).
  • Allowed data operations, such as Create, Read, Update, Delete, and Query (also called CRUDQ).
  • The identity field and display columns for an external content picker used to retrieve external data throughout the user interface.

As in SharePoint On-Premises, BCS in Office 365 provides a means for importing external content type definitions (see Import BDC Models).

With the external content type in place, all the Salesforce-specifics can be ignored by users.

Creating a BDC Model

In addition to the external content type definition, the BDC Model for Salesforce must contain information about how to connect to the Salesforce data. BCS can consume data sources exposed as WCF services, SQL Azure data services, OData endpoints, and web services; Salesforce provides a web service API for external data access. So the question is: how do we match the requirements of the BCS client with the Salesforce service?

In the case of Salesforce data, the Visual Studio and SharePoint Designer tools do not provide a straightforward integration. The good news is that a number of 3rd-party providers solve this problem – e.g. RSSBus Salesforce Connector and BCS Meta Man. These provide a GUI-based tool for generating the BDC Models and external content types; under the covers, they supply an OData proxy for the Salesforce web services.

 

Dealing with Security

Another important consideration of a Salesforce integration is respecting the security of the underlying source system. What is needed is a mechanism for associating the authenticated Office 365 user with an appropriate Salesforce user. Fortunately, such a mapping is supported by the Office 365 Secure Store service. After determining the best authentication/identity mode for Salesforce, a target Secure Store application is created containing the desired credential mapping. Finally, the Secure Store application is associated with the Salesforce BDC Model.


The Payoff

Once the Salesforce external content types are defined, a wide variety of SharePoint elements can be used to create lists, columns, web parts, etc. using Salesforce data. See the Salesforce demo for a quick example.

 

 

]]>
Anglebrackets Conference – Day 3 http://blogs.perficient.com/microsoft/2014/11/anglebrackets-conference-day-3/ http://blogs.perficient.com/microsoft/2014/11/anglebrackets-conference-day-3/#comments Fri, 14 Nov 2014 03:05:50 +0000 http://blogs.perficient.com/microsoft/?p=24287 I have been fortunate to attend this year’s Anglebrackets conference in Las Vegas. (See my coverage of Day 1 here and Day 2 here.)

The following are my notes from the Day 3 keynote, “Conversations with Microsoft.”

The speaker: Steve Guggenheimer, corporate VP and chief evangelist.

A lot of changes at Microsoft over the last year. New CEO: Satya Nadella.

Unchanged: Microsoft will always be partner-led.

Our core: a productivity platform for a mobile-first, cloud-first world.

Four engineering units at Microsoft:

  1. Devices platform. Converging Xbox, Windows, Windows Phone and RT into a single platform: universal apps and One Windows. The VS 2015 preview is out. Demo: creating a universal app from a template.
  2. First-party devices (Surface, Lumia phones, Band, etc.)
  3. First-party services (Office 365, Bing, Yammer, Skype, etc.), with cross-platform APIs and SDKs.
  4. Azure

Microsoft Developer approach:

  • Innovation
  • Agility
  • Openness

Demo: VS2015 – targeting multiple devices with Apache Cordova plugin for VS. Using Android emulator for VS.

Demo: using remote IE (Azure service for testing on IE) on Mac, iPad and Android tablet.

http://remote.modern.ie

Announcement: Open sourcing .NET Core Runtime and Framework. .NET running on Linux and Mac. Free VS 2013 Community edition.


Session:  Implementing EF models for DDD Bounded Contexts

Speaker: Julie Lerman

It is hard to ignore ORM concerns inside a bounded context.

Two ways to approach DDD:

  1. “Ivory tower” – two completely different models, one for the domain and one for persistence, then map between them
  2. Feed the domain model to the ORM framework

Bounded Context

Ubiquitous Language – terminology specific to the bounded context

Bounded context means breaking things apart and separating.

Problem: re-use is good, duplication is bad. But when re-used code starts to be re-purposed (the same thing starts to have different meanings in different contexts), that is bad too.

Example: customer management – when is a Customer a Customer? In customer service it’s a Customer, in Sales it’s a Purchaser, in Shipping it’s a Recipient, in Marketing it’s either a Prospect or an Existing Customer, and in Accounting it’s an Account Holder. It would make sense to track the customer with a contact id – at least that much is common.

Resource: http://Github.com/julielerman/TEE14Demo

A Contact entity in one bounded context doesn’t inherit from the Contact in another BC; both inherit from common classes such as WriteableEntity or ReadOnlyEntity. Simple classes like Address are reused.

Private setters for properties, with factory methods. EF can use private setters just fine.

Don’t reference one BC from another.

Shared Types

Shared Kernel

  • Tightly coordinated Entities and Value Objects
  • Common schema and behavior
  • Reduce duplication, don’t eliminate it

Inheritance

  • Infrastructure
  • Not domain types
  • Favor composition (interfaces) over inheritance. Inheritance is good for infrastructure types (like Address), but not for domain types (Customer).

Data Model != Domain model

Shared Data

How do you map different entities in different bounded contexts to the same table (Customers)?

Green Field: Existing Database

  • Database to EDMX or Code First

Different DbContexts for different BCs – a different Customers mapping in each DbContext. Migrations are not possible in this setup.

Possibility: Different schema (or different databases) for different BCs.

Track changes in entities and generate events. Use these events to synchronize data across the different databases (via a message queue).


Session: Developing with ASP.NET vNext
Speaker: Taylor Mullen

Resource: www.asp.net/vnext

global.json – lists the source folders

Everything is light-weight and opt-in.

Project.json – combines project file and packages file

Startup.cs configures the framework features. Everything is opt-in.

No need to build – it builds on demand.

By default it is completely barebones – no framework (MVC or other) is forced upon you.

It is possible to use multiple different frameworks at once; just configure the mapping.

MVC: controller doesn’t have to inherit from Controller.

Dependency injection is built in and automatic: everything can be declared as an interface and marked with the [activate] attribute. You also need to configure the resolution with AddService().

The @inject Razor statement adds dependency-injected classes to a view. This makes it possible to derive the view from something other than the Razor view and then inject an HtmlHelper, for example.

No need to use HTML helpers any more to create forms and controls: fluent HTML tags mean there is no need to write C# code in the view. You need to reference TagHelpers and add @addtaghelpers.


Session: Visual Studio and Cordova
Speaker – Lino Tardos, MVP

VS tools for Cordova:

  • Released 11/12/2014
  • Available in VS Update 4

What is Cordova?

  • Open source version of PhoneGap
  • Allows you to write apps for the Android, iOS and Windows platforms
  • Apps are developed with HTML5, JS and CSS only
  • Apps run natively on the device

How Cordova works:

  • Windows and Android
    • VS
    • MS build
    • VS-MDA
    • Cordova
    • Native tools
  • iOS
    • VS
    • MS build
    • VS-MDA
    • VS-MDA Remote (OSX)
    • Cordova (OSX)
    • Xcode (OSX)

Demo: create Cordova app from VS template

Demo: using Cordova + Kendo UI

Demo: Cordova application with Angular JS

Resources: www.falafel.com


Keynote: Managing programmers

Speaker – Douglas Crockford

  • Programmer
  • Manager of programmers
  • Manager of managers of programmers

Programmer:

  • Computer programmer
  • Computer Scientist
  • Software Engineer
  • Software Developer
  • Coder
  • Hacker

Not like the other kids (you can’t manage them the same way as other people)

Creative

Good programmers vs bad: 1:10 or 1:100

Some programmers have negative contribution

Two comics:

  • Dilbert
  • Xkcd

Ineffective metrics

  • Lines of code
  • Introduction of Bugs
  • Fixing bugs

Programming is not manufacturing

Discovery

Trial and error

Waterfall – doesn’t work

Agile – doesn’t exist

Directed Anarchy – working

Most managers want to see programmers looking busy and showing a good attitude.

A good measure: read the code. Daily code reading. Readability.

Hiring: make the candidate bring a piece of code and defend it with the team.

Programmers look like office workers, but they are creatives.

Office structure

Flow: private offices, free of distraction.

Communication: bullpens (startups)

Worst possible: cubicles and open space.

Ideal

  • Large project rooms with whiteboards
  • Lots of meeting rooms, different sizes, round tables
  • Classrooms to continuously train people
  • Library (silent)
  • Padded cells
  • Diner, booths with power plugs
  • Food
  • Bunk room
  • Shower
  • Day care

Programmers should own their own machines

Natural enemies of programmers:

  • Complexity. Don’t overcomplicate software.
  • Imperfection. Try to use best practices to minimize bugs.
  • Time. Premature optimization is bad. Measure, cut, measure.
  • Mismanagement. Programmers don’t like to be mismanaged.
  • Themselves. Encourage education, prevent “informed ignorance”.

Counterproductive measures:

  • add staff
  • Cancel continuous education
  • Hit intermediate milestones
  • Extend workday
  • Panic mode

Productive:

  • Talented staff
  • Clear, stable requirements
  • Minimal distractions
  • Focus on quality
  • Sufficient time

“If it’s stupid, we won’t do it.” Programmers want to deliver a quality product.

]]>
Power BI Basics Inside Office 365 – A Video Series http://blogs.perficient.com/microsoft/2014/11/power-bi-basics-inside-office-365-a-video-series/ http://blogs.perficient.com/microsoft/2014/11/power-bi-basics-inside-office-365-a-video-series/#comments Thu, 13 Nov 2014 19:42:02 +0000 http://blogs.perficient.com/microsoft/?p=24282 Yesterday, we were fortunate to have a customer, Heidi Rozmiarek, Assistant Director of IT Development for UnityPoint Health, speak alongside our Microsoft BI team for the webinar, “Hybrid Analytics in Healthcare: Leveraging Power BI and Office 365 to Make Smarter Business Decisions.”

It was an informative session that began by covering architectural components and functions, and architecture options – on-premises, hybrid and cloud – along with delivery considerations. Following this, we had a live Power BI demo, and last but not least, Heidi shared how her organization is using the Microsoft BI stack to provide simple solutions for complex questions. Keep an eye out for a post describing the webinar in more detail; in the meantime, you can view the replay here.

Whether or not you attended the webinar, if you are interested in learning more about building a hybrid analytics platform with Power BI and Office 365, I highly recommend taking a look at the following short video series.

  1. Introduction to Power BI:  The first video includes an introduction to Power BI, particularly around Power BI Sites, “My Power BI” and the Power BI Admin page.
  2. Administration and Permissions in Power BI: This video focuses on Site Admin and security basics.
  3. Data Exploration and Visualization in Power BI: The third video in the series discusses data exploration and visualization using Excel and related power tools, including Power Pivot and Power View.
  4. Data Management Gateway for Power BI: Here, we cover the steps to enable data feeds in Power BI using the Data Management Gateway.
]]>
http://blogs.perficient.com/microsoft/2014/11/power-bi-basics-inside-office-365-a-video-series/feed/ 0