Archive for the ‘Exchange Server’ Category

Webinar: Binary Tree & Perficient on Migrating to Exchange

It’s not uncommon for an organization to make avoidable mistakes as it prepares for and then executes a migration from a legacy email platform to Microsoft Exchange (and that applies to on-premises Exchange 2013 migrations as well as moves to Office 365 / Exchange Online).

If you are hoping to mitigate those migration risks and ensure seamless coexistence between your current email platform and Exchange, join Perficient and Binary Tree on Tuesday, September 9, 2014 at 1 p.m. CT for a webinar, Best Practices & Solutions for Migrating to Microsoft Exchange. During the session, you’ll learn how to dramatically reduce the costs, complexities, and timeline of your migration, and hear why a move to the cloud might be the right decision for you.

René Strawser and James Tolentino, both lead technical consultants in our Microsoft practice, will first take a look at common challenges surrounding migrations from legacy email platforms, details around making the move to Office 365, and will walk through a recent example of an organization that successfully migrated to Exchange Online with Binary Tree.

Binary Tree solution architect Perry Hiltz will then present a technical deep dive into their award-winning SMART migration software solutions, CMT for Coexistence and CMT for Exchange, which can be used on-premises or remotely. Binary Tree is the leading provider of messaging and collaboration transformation technology and solutions for the Microsoft platform in cloud, on-premises, or hybrid environments. You can learn more about Binary Tree by visiting their website.

To register for the webinar, click here.
Best Practices and Solutions for Migrating to Microsoft Exchange
Tuesday, September 9, 2014
1:00 p.m. CT

SharePoint 2013 Service Pack 1 (SP1) Released!

In further anticipation of SharePoint Conference 2014, today Microsoft announced the release of Service Pack 1 for all Office products, including SharePoint 2013. This has been a long time coming. Most of us in the industry were expecting SP1 to arrive sometime last fall, right around the one-year mark of the product release. There was so much buzz last fall that Microsoft even had to blog to tell us the release date was “coming.” This update covers the following products: Access, Excel, InfoPath, Lync, OneDrive (formerly SkyDrive), OneNote, Outlook, PowerPoint, Publisher, Word, Project, Visio, SharePoint Designer, Project Server, Office Web Apps Server, and of course SharePoint Server.

Some of the highlights include:

  • Compatibility fixes for Windows 8.1 and Internet Explorer 11.
  • Better support for modern hardware, such as high DPI devices and the precision touchpad.
  • New apps for Office capabilities and APIs for developers.
  • Power Map for Excel, a 3D visualization tool for mapping, exploring, and interacting with geographical and temporal data in Excel, is now available to Office 365 ProPlus subscription customers.
  • Improvements to the Click-to-Run virtualization technology that installs and updates Office 365 desktop applications.
  • SkyDrive Pro is now OneDrive for Business.

Note this is for on-premises deployments only. Customers who have an Office 365 subscription are always kept up to date and will get the SP1 changes automatically in their next regular update.

As with all Service Packs, SP1 also includes all public updates and cumulative updates that Microsoft released from the date Office 2013 became available (fall 2012) through early this year.

For a detailed list of all updated features, download this Excel file.

As this was just released today, we have not fully tested the Service Pack at Perficient. We will be applying this to our internal dev environments for testing very soon and will report back with any issues. Feel free to email or comment if you have any questions.

Using System Center Automation to Manage Office 365

Manage Office 365 with Microsoft System Center Service Manager, Orchestrator, PowerShell or Custom GUI.

Working on Office 365 projects, one of the questions I come across frequently is: what are some of the ways to manage Office 365 from on-premises? Up to now there has been a very limited tool set for even simple tasks. DirSync is a tool offered by Microsoft to synchronize User Principal Names from the local Active Directory to the Office 365 cloud. Federated services provide single sign-on to the cloud, which lets administrators manage passwords locally. The Exchange Management Console has some management functionality for Office 365 mailboxes, but it requires a hybrid deployment. PowerShell offers the most flexible on-premises management abilities. Then there are some third parties out there that provide simple management tools to do things like synchronize passwords or migrate mailboxes. Read the rest of this post »
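As a quick illustration of the PowerShell option, here is a minimal sketch of connecting to a tenant with the Microsoft Online Services (MSOnline) module and running a simple management task. This is not from the post above; the module, a live tenant, and admin credentials are assumed.

```powershell
# Sketch only: assumes the MSOnline PowerShell module is installed and
# you have Office 365 tenant administrator credentials.
Import-Module MSOnline

# Prompt for credentials and connect to the tenant
$cred = Get-Credential
Connect-MsolService -Credential $cred

# A simple on-premises-driven management task: list users and licensing state
Get-MsolUser -All | Select-Object UserPrincipalName, IsLicensed
```

From here the same session can be combined with remote Exchange Online PowerShell or wrapped in System Center Orchestrator runbooks, which is the direction the post goes on to describe.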

Copying distribution groups to cloud for Outlook/OWA management

While directory sync provides a much-needed service for Office 365 tenants, one pain point that comes up pretty regularly is distribution group management once you’re in the cloud. Sure, the groups get synced to the cloud, but if you’ve been used to managing group memberships with Outlook when everyone’s mailbox was on-premises, once you move your mailbox to the cloud you won’t be able to do that anymore. This is because the object is synchronized from your local AD, and therefore you must make changes to the group in Active Directory and let dirsync bring those changes to the cloud. If you have a hybrid server or local Exchange environment, you could use it to manage the membership, but most likely you’re not going to allow users to access the EMC. You could also create your own application that allows users to edit groups in your local AD, but honestly, who wants to spend development time doing that?

So what other options are there? Well, the only way is to recreate each group directly in the cloud. What if you have hundreds or thousands of groups and thousands of members in those groups? I know, it doesn’t sound like this would be any fun at all, and it’s not. You can automate this process using PowerShell and maybe some simple Excel skills. I like keeping things organized, so I use Excel to prepare input files for my bulk PowerShell operations. For this particular task, I got a list of the existing distribution groups from my on-premises Exchange environment with a few attributes that allow me to bind to the AD object and leverage other attributes in my script. I would grab, at a minimum, the displayName, mail, and mailNickname. Using Excel, I would then use this information to create the new displayName, mail, and mailNickname for the cloud-based distribution groups. To show you what I mean, here’s an example input file (CSV) for my script (the addresses use a placeholder domain):

oldgroupDisplayname,oldgroupMail,oldgroupAlias,newgroupDisplayname,newgroupMail,newgroupAlias
DGroup1,dgroup1@contoso.com,dgroup1,Cloud Group1,cloudgroup1@contoso.com,cloudgroup1
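To build the “old” columns of an input file like this, one approach (a sketch, not the author’s script) is to export the existing groups from the on-premises Exchange Management Shell and then fill in the “new” columns with Excel:

```powershell
# Sketch only: run in the on-premises Exchange Management Shell.
# Exports the existing group attributes under the column names used by the
# input CSV; the newgroup* columns are then added by hand in Excel.
Get-DistributionGroup -ResultSize Unlimited |
    Select-Object @{n='oldgroupDisplayname';e={$_.DisplayName}},
                  @{n='oldgroupMail';e={$_.PrimarySmtpAddress}},
                  @{n='oldgroupAlias';e={$_.Alias}} |
    Export-Csv groups-old.csv -NoTypeInformation
```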


Now for the very simple script to connect to the cloud, create the new group and populate the membership based on the existing group:

# Prompt for Office 365 admin credentials
$o365creds = Get-Credential

# Connect to cloud (the standard Exchange Online remote PowerShell endpoint)
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri https://outlook.office365.com/powershell-liveid/ -Credential $o365creds -Authentication Basic -AllowRedirection

# Note: The prefix below is used to differentiate between the Exchange Online commands and the local Exchange commands (e.g. Get-CloudMailbox vs. Get-Mailbox)
Import-PSSession -Session $Session -Prefix "Cloud" -DisableNameChecking:$true -AllowClobber:$true

Import-CSV groups.csv | % {

    # Create the new group
    New-CloudDistributionGroup -DisplayName $_.newgroupDisplayname -Name $_.newgroupDisplayname -Alias $_.newgroupAlias -PrimarySmtpAddress $_.newgroupMail -Type Distribution

    # Grab members from the old (synced) group
    $groupmembers = @(Get-CloudDistributionGroupMember -Identity $_.oldgroupMail)

    # Now add the members from the old group to the new group
    foreach ($groupmember in $groupmembers) {
        Add-CloudDistributionGroupMember -Identity $_.newgroupMail -Member $groupmember.PrimarySmtpAddress
    }
}
Note the prefix (“Cloud”) that I used in the example. This simply means to prefix the cmdlet you’re running with “Cloud” (i.e. Get-CloudMailbox instead of Get-Mailbox). Using a prefix allows me to use multiple remote PowerShell sessions, one against the cloud and one against the on-premises Exchange environment, so I can keep track of which objects I’m updating. This script could easily be expanded to configure other settings on the new cloud distribution group and to duplicate other settings from the on-premises group, like the manager, proxyAddresses, group opt-in/opt-out settings, etc.

I hope this proves useful for someone out there faced with the same challenge.

Lync Server 2013 Internal Server Roles

This is post 10 of the twelve post series, to see an index of all twelve posts, click here.

On the 10th day of Lync’mas my UC Team gave to me: 10 Lync Internal Server Roles!

On the surface (no pun intended), Lync Server 2013 is, or at least was upon release, widely perceived to be not much of a change over Lync Server 2010, but rather a simple refresh. “Ho Ho Hoooo-boy!” This simply couldn’t be further from the truth once you dive into each of the different roles of Lync Server 2013. Rather than use this blog post to deep-dive into those roles, I will highlight all the servers and roles associated with Lync Server 2013, in contrast to Lync Server 2010. Don’t forget that these roles do not necessarily require their own separate servers, as they can be co-located.

The core of any Lync Server 2013 deployment continues to be Enterprise Edition (“EE”) pool servers or a single Standard Edition (“SE”) server. With both Lync Server 2010 and Lync Server 2013, there are multiple servers and roles associated with a deployment. These servers/roles include:

Read the rest of this post »

Using PowerShell in Windows Server 2012 to create a simple lab

I’ve been meaning to sit down and spend some time exploring the new Active Directory cmdlets that come with Windows Server 2012 so I decided to use my lab to create some test objects and populate the mailboxes with some messages.

My lab setup is very simple:

  • 1 – Windows Server 2012 domain controller
  • 1 – Exchange 2013 server (hosted on Windows 2012)
  • 1 – Windows 8 client with Office 2013

My goal was to be able to quickly create some test users and groups in a new OU structure, populate the groups with the accounts, and finally populate the mailboxes with some test messages. Here is the script I created to do that. It should be fairly straightforward to follow. There are obviously many other ways to do this. This is just one such way. I ran the script from the Exchange 2013 Management Shell after installing the Active Directory PowerShell module.
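Since the full script is behind the link, here is a minimal sketch of the kind of Active Directory cmdlets involved. All the names (TestOU, contoso.com, TestUser, TestGroup1, the password) are placeholders of mine, not values from the original script:

```powershell
# Sketch only: assumes the ActiveDirectory module on Windows Server 2012
# and a lab domain; all object names and the password are placeholders.
Import-Module ActiveDirectory

# Create a new OU to hold the test objects
New-ADOrganizationalUnit -Name "TestOU" -Path "DC=contoso,DC=com"

# Create a handful of test users in the new OU
1..5 | ForEach-Object {
    New-ADUser -Name "TestUser$_" -SamAccountName "testuser$_" `
        -Path "OU=TestOU,DC=contoso,DC=com" -Enabled $true `
        -AccountPassword (ConvertTo-SecureString "P@ssw0rd!" -AsPlainText -Force)
}

# Create a distribution group and populate it with the test accounts
New-ADGroup -Name "TestGroup1" -GroupScope Universal -GroupCategory Distribution `
    -Path "OU=TestOU,DC=contoso,DC=com"
1..5 | ForEach-Object { Add-ADGroupMember -Identity "TestGroup1" -Members "testuser$_" }
```

Mailbox enablement and the test messages would then be handled with the Exchange 2013 cmdlets from the same Management Shell session.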

Read the rest of this post »

Converting a DiskPart script to PowerShell on Windows Server 2012

Windows Server 2012 and PowerShell 3.0 offer thousands of new cmdlets to make any scripter happy. One set of cmdlets had me intrigued when I first heard about them. The cmdlets in the storage module interested me because I frequently have to deploy a number of new Exchange servers and, depending on the design, sometimes need to prepare a lot of disks and create mountpoints. Depending on the number of servers and disks involved, I would either do everything manually or use a diskpart script. I’m always for automating things like this as much as possible, so when I heard about the new storage cmdlets in 2012 I was pleasantly surprised.

Anyone who has used diskpart scripts will know they aren’t always easy to implement and require some testing and tweaking to get fully automated. Good luck trying to put a lot of logic in a diskpart script, too. The only switch I found useful was NOERR, which basically tells the script to continue and “ignore the man behind the curtain” when an error is encountered. Here’s a snippet of one such script. In it I’m creating two disks, one for a database and one for a log, then linking them to a directory as a mountpoint.
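For comparison, here is a hedged sketch of the same kind of work done with the Windows Server 2012 storage cmdlets; the disk number, label, and mountpoint path are placeholders, not values from the snippet:

```powershell
# Sketch only: disk number, volume label, and mountpoint path are placeholders.
# Prepares a raw disk and mounts it to a directory instead of a drive letter.
$diskNumber = 1

# Bring the disk online, clear the read-only flag, and initialize it as GPT
Set-Disk -Number $diskNumber -IsOffline $false
Set-Disk -Number $diskNumber -IsReadOnly $false
Initialize-Disk -Number $diskNumber -PartitionStyle GPT

# Create a partition using all available space (no drive letter assigned)
$partition = New-Partition -DiskNumber $diskNumber -UseMaximumSize

# Format NTFS with a 64 KB allocation unit, a common choice for Exchange databases
Format-Volume -Partition $partition -FileSystem NTFS -NewFileSystemLabel "DB01" `
    -AllocationUnitSize 65536 -Confirm:$false

# Link the volume to a directory as a mountpoint
New-Item -ItemType Directory -Path "C:\ExchangeVolumes\DB01" -Force | Out-Null
Add-PartitionAccessPath -DiskNumber $diskNumber -PartitionNumber $partition.PartitionNumber `
    -AccessPath "C:\ExchangeVolumes\DB01"
```

Unlike diskpart, this version can be wrapped in ordinary PowerShell logic (loops over a disk list, try/catch error handling) instead of relying on NOERR.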

Read the rest of this post »

Why I love PowerShell…and so should you

This blog post is meant for both the PowerShell newbie and the scripter out there looking for a reason to start learning the aptly named PowerShell, or to push themselves to learn a new aspect of PowerShell they’ve been meaning to try.

It’s been a few years now since PowerShell first came to be. Remember those Monad days when we first got a glimpse at what Microsoft had up their sleeve? I’ll admit I was one of the skeptical ones, deeply entrenched in VBScript, DOS batch files, AutoIT, VB.Net, etc. I thought to myself, “Great, another programming language. This will never catch on. Microsoft did what to the administrative interface?!” I just didn’t get it at first.

When Exchange 2007 hit the market I knew they were serious. Microsoft cleverly led me (although initially it felt more like “forced me”) to learn this new scripting language by including helpful syntax examples whenever I used the Exchange Management Console to do simple, and sometimes complex, tasks:

For example, moving a mailbox:

'Account1' | Move-Mailbox -TargetDatabase 'E2K7SVR1\First Storage Group\Exchange2007DB2'

Ok. That was simple enough, and looking at the code, it was somewhat easy to follow the logic, although at the time I didn’t have any clue what the syntax rules were or how to do anything I was used to doing with VBScript. Ah, my cherished VBScript. Not anymore! Fast-forward a few years. Read the rest of this post »

Outlook 2013 and no Default Gateway

Recently I was setting up Exchange 2013 RTM in my lab and ran into an error when trying to configure my first mailbox with Outlook 2013 running on Windows 7.  While walking through the mailbox configuration wizard, I was prompted with the following error while it was searching for my Outlook settings:


Since my lab configuration is rather simple, running on VMware Workstation 9.0, I have a single custom subnet for my Windows Server 2012 domain controller, my combined Exchange 2013 Mailbox/CAS server, and a single Windows 7 workstation.  In past configurations like this with Exchange 2010 and Outlook 2010, I never encountered this issue when setting up an Outlook profile.  After doing some research on the web, I stumbled upon a KB article describing a similar issue with Outlook 2007/2010, so I figured I would give it a try with Outlook 2013 and see what happens.  I added the registry key as outlined in the KB article; however, I did have to add a key under Outlook called “RPC” as it didn’t exist, and then add the appropriate DWORD value of DefConnectOpts with a value of “0” as shown below:


Once I did this, my Outlook 2013 profile was successfully configured!
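If you’d rather script the change than click through regedit, a PowerShell equivalent might look like the following. This is a sketch based on the steps described above, assuming Outlook 2013’s per-user 15.0 registry path:

```powershell
# Sketch: assumes the Outlook 2013 (15.0) per-user registry path described above.
$rpcKey = "HKCU:\Software\Microsoft\Office\15.0\Outlook\RPC"

# Create the RPC key if it doesn't exist, then add the DWORD value
New-Item -Path $rpcKey -Force | Out-Null
New-ItemProperty -Path $rpcKey -Name "DefConnectOpts" -Value 0 `
    -PropertyType DWord -Force | Out-Null
```

Restart Outlook after making the change so the new value is picked up.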

Using Exchange 2010 Native Data Protection

Recently I had the pleasure of working with a customer who decided to eliminate backups within their Exchange organization.  They were upgrading from Exchange 2007 to Exchange 2010 and wanted to take advantage of many of the new features that Exchange 2010 had to offer, such as larger mailbox databases on cheaper storage.  The customer was increasing mailbox quotas for approximately 12,000 users from 120 or 200 MB to 2 or 3 GB, with a handful of users given unlimited mailboxes.  There was also a subset of approximately 19,000 users whose 35 MB mailbox quotas would be increased to 75 MB in Exchange 2010.  Their email retention time was 180 days, but they were in the process of reviewing that with their legal department and possibly reducing it to 30 days.  During their design session we came upon the topic of backups and what they planned to do going forward with Exchange 2010 while increasing the mailbox size limits.  After running through the Exchange 2010 Mailbox Server Role Requirements Calculator with the Messaging and Storage teams, the amount of storage they would have to purchase for their backup system was simply not feasible from a budgetary standpoint.

After numerous discussions about Exchange 2010 backups, storage, and database copies, the customer decided they wanted to explore the idea of using Exchange Native Data Protection with Exchange 2010 and eliminate backups completely from their environment.  With the guidance of the TechNet documentation on Exchange Native Data Protection, the choice was clear that by going to Exchange 2010, traditional backups would no longer be necessary.  We were able to meet many of the following considerations for using Exchange Native Data Protection, which allowed the customer to meet their business and technical requirements for upgrading to Exchange 2010.

1.  Your recovery time objective and recovery point objective goals should be clearly defined, and you should establish that using a combined set of built-in features in lieu of traditional backups enables you to meet these goals.

  • Resolution: In working with the customer’s legal department, the decision was made that the retention time for email would be reduced to 30 days for all users. We entered this 30-day retention time into the Exchange 2010 Mailbox Server Role Requirements Calculator, in combination with Single Item Recovery.

2.  You should determine how many copies of each database are needed to cover the various failure scenarios against which your system is designed to protect.

  • Resolution: The customer decided they would have 3 copies of each database spread across two datacenters configured active/active. In this case, the customer had F5 GTM and LTM load balancers, a very high-speed connection between the datacenters, and multiple backup connections with different carriers in case the primary WAN link went down.

3.  Can you afford to lose a point-in-time copy if the DAG member hosting the copy experiences a failure that affects the copy or the integrity of the copy?

  • Resolution:  Because there are 3 copies of each database and a very high-speed connection between Datacenters this will not be an issue in this specific deployment.

4.  Exchange 2010 allows you to deploy larger mailboxes, and the recommended maximum mailbox database size has been increased from 200 gigabytes (GB) in Exchange 2007 to 2 terabytes (when two or more highly available mailbox database copies are being used). Based on the larger mailboxes that most organizations are likely to deploy, what will your recovery point objective be if you have to replay a large number of log files when activating a database copy or a lagged database copy?

  • Resolution: Again, for this customer this was a moot point because of their datacenter configuration. However, we did deploy a single lagged copy with a 7-day replay lag window for around 150 executive-level users to cover the very rare event of logical corruption; see the following point.

5.  How will you detect and prevent logical corruption in an active database copy from replicating to the passive copies of the database? What is your recovery plan for this situation? How frequently has this scenario occurred in the past? If logical corruption occurs frequently in your organization, we recommend that you factor that scenario into your design by using one or more lagged copies, with a sufficient replay lag window to allow you to detect and act on logical corruption when it occurs, but before that corruption is replicated to other database copies

  • Resolution: After speaking with a number of my colleagues as well as a few folks at Microsoft, logical corruption is something that is very, very unlikely to happen given the changes in disk technology since the release of Exchange 5.5. Ever since the release of Exchange 2000/2003, Microsoft has built in a number of safeguards that help prevent logical corruption.

Every customer environment is different, and careful planning and understanding of both the business and technical requirements are crucial.  By moving to Exchange 2010, this customer was able to reduce costs and improve the performance of their messaging environment in a number of ways.  First, by using multi-role servers we were able to reduce the number of servers in the environment from 15 down to 10.  Second, all of their storage was low-cost, high-capacity, direct-attached storage.  And third, by implementing multiple database copies along with deleted item retention and single item recovery, traditional backups, and the disk needed to back up the larger mailboxes, were eliminated.