

5 Tips to Maximize Your Copilot for Microsoft 365 Experience


In my recent post, I discussed Microsoft’s announcement of Microsoft 365 Copilot. From your inbox to customer interactions, this approach to Generative AI is meant to be a game changer for businesses.

Copilot creates a new knowledge model for every organization to harness the massive reservoir of data and insights that lies largely inaccessible and untapped today. And it does so within the existing commitments to data security and privacy in the enterprise.

Business Chat – the chatbot experience of Copilot – works across all your business data and apps. It surfaces the information and insights you need so knowledge flows freely across the organization, saving you valuable time searching for answers. You will be able to access Business Chat from Microsoft 365.com, from Bing when you’re signed in with your work account, or from Teams.

This transformative shift to Generative AI in the workplace looks very promising. Unfortunately, most of the new Copilot for Microsoft 365 functionality is in private preview.

For organizations looking to start on their journey with Microsoft Copilot, we recommend preparing in these 5 key ways:

  1. Educate Admins and End Users on the Basics to Galvanize Excitement
  2. Promote Authoritative Content for Better Results
  3. Secure Your Content to Prevent Data Leakage
  4. Set Expectations: Communicate and Evangelize Your Company’s Stance on Generative AI and Microsoft Copilot
  5. Understand the Roadmap to Reduce Business Impact

1. Educate Admins and End Users on the Basics to Galvanize Excitement

Communication and organizational change management are key to any technological transformation, and that starts with clear, informative messaging.

AI is often considered “creepy” or threatening by information workers concerned about their viability within the company. The nomenclature Microsoft has chosen for its approach is deliberate: Copilot is there to assist rather than lead the generation of content or replace the need for human supervision.

Another key distinction: Microsoft Copilot is not the same as ChatGPT. Knowing how Copilot works and what data it shares will be paramount to admins and executives alike.

Microsoft Copilot does not use customer data, prompts, etc. to train or improve its large language models. This data stays within the Microsoft 365 tenant. When a user inputs prompts using any Microsoft AI tools, the information contained within their prompts, the data the prompts retrieve, and the generated responses remain within the Microsoft cloud. (See Microsoft’s stance on privacy and responsible AI for more details.)

2. Promote Authoritative Content for Better Results

AI is meant to be smart. But it is only as smart as its sources. Most organizations have multiple areas considered to be “authoritative content sources”. These areas range from the corporate intranet to policy portals to records management solutions.

Authoritative content is unique, verifiable content that informs the end-user. This type of content usually takes the form of policies, procedures, news, wikis and other areas promoting the latest and greatest on a particular organizational subject area.

Microsoft 365 identifies authoritative content in many ways. Four of the most common are Microsoft Search, Viva Topics, SharePoint intranet information architecture and content security. Proper configuration of these tools helps ensure relevant content is surfaced in the user experience.

Microsoft Copilot will utilize the Microsoft Graph to search and retrieve content for the end-user executing the search. Proper organization and promotion of content such as document templates, official images and assets, and other office documents will allow Copilot to provide the best starting point for content it generates.

3. Secure Your Content to Prevent Data Leakage

Content search has always been a complex topic for SharePoint and Microsoft 365 collaboration as a whole. Many Microsoft 365 professionals may remember Office 365 Delve. Delve used search and artificial intelligence to surface people, documents and other items from Microsoft 365. This content was security trimmed, limiting the potential breadth of content to that which the end-user could see. Even so, Delve exposed a major problem for many organizations: it unearthed troves of documents – from simple planning document drafts to spreadsheets with important personal employee data – that were not properly secured.

At a high level, an organization should start with Microsoft’s Security and Compliance features to enforce desired organizational governance standards. Yes, this can be a monolithic task. Fortunately, this work can be divided into three different areas of concern:

  • End-users – Individual information workers and content owners should check out their shared links and content to ensure they are not oversharing sensitive content from OneDrive, their SharePoint Sites, and Teams content.
  • Collaborative Groups, Functional Teams and Microsoft Team Owners – Administrators of major content areas as well as Microsoft Team and Channel Owners should revisit their Team and Channel memberships and permission levels.
  • Microsoft 365 Administrators – Admins often carry the heaviest load in this scenario. There are tons of features to assist in safeguarding content:
    • Microsoft Purview can assist with sensitivity labels, retention and other use cases. Each Microsoft 365 license provides advantages. Check out this quick reference for what’s available for your organization based on your current licensing.
    • Revisit your content retention and archiving strategy – while most content and IP should be an advantage to your organization, content living outside your organization’s content retention and archiving policies can often turn into liabilities. Revisiting retention and archiving will shore up your available content for users.
    • Test the Availability of Sensitive Content outside of Microsoft Search – Microsoft Search honors security settings, presenting only the content that’s available to the end-user submitting the search. SharePoint Online settings such as Hubs and Home sites can help to trim the results and improve search relevance. This can also provide a false sense of security around your content. Use the Microsoft Search’s Organization tab to search for known sensitive content. If available, try submitting those searches through Microsoft Search in Bing (formerly Bing for Business) via the Work tab to see content that may be obscured by any SharePoint Online and Microsoft Search settings.
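The spot-check described in the last bullet can also be scripted. Below is a minimal sketch in Python that builds a request for the Microsoft Graph Search API (`/v1.0/search/query`). It assumes you have already acquired a delegated access token with the appropriate Search permissions (token acquisition is out of scope here, and the `token` variable in the commented call is hypothetical). Because the Search API security-trims results to the calling user, any hit returned is content that user – and therefore Copilot acting on their behalf – can reach.

```python
import json

# Microsoft Graph endpoint for the Microsoft Search API.
GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"


def build_search_payload(query_string: str, size: int = 10) -> dict:
    """Build a Microsoft Search API request body.

    Results from this API are security-trimmed, so the response only
    contains items the signed-in user is permitted to see.
    """
    return {
        "requests": [
            {
                # Search files and list items across SharePoint/OneDrive.
                "entityTypes": ["driveItem", "listItem"],
                "query": {"queryString": query_string},
                "size": size,
            }
        ]
    }


# Illustrative query for known-sensitive content (adjust for your org).
payload = build_search_payload('"employee salary" filetype:xlsx')
print(json.dumps(payload, indent=2))

# To run the actual check (hypothetical `token` holding a delegated
# access token):
#
# import requests
# resp = requests.post(
#     GRAPH_SEARCH_URL,
#     headers={"Authorization": f"Bearer {token}"},
#     json=payload,
# )
# hits = resp.json()["value"][0]["hitsContainers"][0].get("hits", [])
# Any hit here is content this user could surface through search or Copilot.
```

Running this check while signed in as a typical end-user (not an admin) gives a realistic picture of what oversharing looks like from that user’s seat.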

4. Set Expectations: Communicate and Evangelize Your Company’s Stance on Generative AI and Microsoft Copilot

Generative AI tools are the talk of the technology sector. Microsoft Copilot is not the only Generative AI tool. OpenAI, Google, Facebook, Adobe and more will be providing AI-centered features to assist with day-to-day activities – from writing code to creating graphic design artifacts.

Generative AI is not the enemy. It is simply a tool. Get out ahead of the wave of AI tools. Provide a well-communicated, thoughtful strategy for how your organization will utilize Generative AI tooling based on each tool’s AI responsibility and privacy standards.

It is perfectly fine to have an initial communication stating these tools are being investigated for corporate use, but saying nothing will only ensure they will be brought into the organization faster than expected and under less governance than desired.

5. Understand the Roadmap to Reduce Business Impact

Each flavor of Copilot has its own release timeline. This means your governance committee, organizational change managers, technology advocates, Microsoft cloud administrators, etc. will need to track the release windows for these tools individually, including, but not limited to:

  • Copilot for Microsoft 365 (Outlook, Word, Excel, PowerPoint, Microsoft Teams, Business Chat, Loop and Microsoft Viva)
  • Copilot for GitHub
  • Copilot for Power Apps, Power Automate and Power Virtual Agents
  • Copilot for Security (Microsoft Sentinel, Microsoft Defender, and Microsoft Intune)
  • Copilot for Sales (Viva Sales and Dynamics 365 Marketing, Customer Service, Supply Chain Management and Business Central)

This will be difficult for most IT organizations. Fortunately, each of these will have their own business owner and sponsor. Engage the business as soon as possible to enable the productivity gains afforded by these features rather than fighting against the wave that is Generative AI.

For more information on Copilot for Microsoft 365, check out Microsoft’s The Future of Work with AI event.

Ready to Reimagine Your Employee Experience?

Our dedicated Microsoft Modern Work practice brings the best expertise in the industry. From M365 Strategies to Intelligent Intranet to Microsoft Teams to Microsoft Viva, our consultants are here to ensure your success.

As a designated Modern Work Microsoft Solutions Partner and Viva Early Adopter, our Microsoft Partner Advisory Council and Partner Program contributions along with our 20+ years of delivering employee experiences to our clients means we seek to build the best strategy for your organization.


Ron Jones, Practice Director, Microsoft Modern Work

Ron is the practice lead for Microsoft Modern Work including M365 security, M365 foundations, tenant-to-tenant migrations, collaboration, and Microsoft-based employee experiences including modern intranets and Microsoft Viva. He is also an active member of the Microsoft community, leading the Atlanta SharePoint and Microsoft 365 User Group (ATL365) and speaking at events.
