Let’s start with ChatGPT
Unless you’ve been living off-grid for the past few months, you’ve likely had a play around with ChatGPT, or have at least heard the rave reviews about it from friends and family. ChatGPT launched just 6 months ago, but has already been sending shockwaves around the globe, pushing the boundaries of human-machine interaction. ChatGPT is designed to generate human-like, natural language responses to natural language input, hence the term ‘Chat’.
ChatGPT is powered by a large language model (LLM), which, simply put, is a model trained on a huge dataset of billions of pages of text.
What ChatGPT is not
Whilst ChatGPT demonstrates impressive language capabilities, it does not possess true understanding, consciousness, or emotion. It relies on statistical patterns in the data it was trained on, and can generate responses that sound plausible but are not always factually accurate or grounded in common sense.
It is also not up to date with ‘live’ information. It cannot tell you today’s news, what the weather forecast is, or details about things that have happened recently, such as product launches – go on, ask ChatGPT what’s new in the iPhone 14 Pro (launched in September 2022) and it won’t have a clue. In fact, at the time of writing, ChatGPT’s knowledge only goes up to September 2021, and a lot can happen in 19 months – ChatGPT knows nothing about the war in Ukraine, the global energy crisis, or life after Covid-19.
I thought this was about Microsoft 365 Copilot
I’m getting there, I promise! To understand some of the key concepts around Copilot that I’ll be discussing in this blog, it first helps to have a foundational understanding of what ‘GPT’ and ‘LLM’ technology is, and, equally importantly, what it isn’t – and there’s no better example out there right now than ChatGPT.
Microsoft and OpenAI
In January of this year, Microsoft announced its third investment in OpenAI, the organisation behind ChatGPT, as part of a multiyear, multibillion-dollar investment in the company. It should therefore come as no surprise that GPT (OpenAI’s LLM architecture) underpins Microsoft’s multitude of ‘Copilot’ offerings.
The Microsoft 365 Copilot elevator pitch
Microsoft 365 Copilot is intended to be exactly that – a co-pilot – designed to work alongside humans, not replace them. We’re talking Clippy the paperclip (remember him?!), but a teeny bit smarter. It enables you to use natural (conversational) language to interact with technology. It combines the power of large language models (LLMs) and any additional context you provide during a conversation with the data contained in your M365 tenant and apps, to transform the way you interact with Microsoft 365 applications.
Is it secure?
You may have read last month about Samsung suffering several data leaks after employees used ChatGPT to help fix problems in source code and to summarise meeting notes into a presentation. ChatGPT retains user input to further train itself, and the confidential inputs provided by Samsung employees are now unwittingly in the hands of OpenAI and part of the LLM powering the public ChatGPT service.
Whilst a disclosure like this from such a high-profile tech firm would have many CSOs running to pull up the drawbridge and disable Copilot in their organisation before it’s even generally available, it’s important first to understand the high-level architecture of Copilot and how it differs from the public ChatGPT service. I know I’m the one who started this blog by drawing comparisons between the two due to their shared underlying mechanics, but their data architectures are entirely different, so it’s now time to separate the two.
First, the large language models (LLMs) used by Copilot are hosted in the Azure OpenAI Service – Microsoft maintains dedicated instances of the LLMs, separate from the models which power the public OpenAI (ChatGPT) service. Second, Copilot respects per-user access permissions, so you avoid scenarios where a user asks for information about a top-secret project, or tries to find out the CEO’s salary (who would dare?!), and Copilot freely offers it up – if a user doesn’t have access to the source data, Copilot cannot surface it in response to a prompt. And finally, your organisation’s data is never stored in the LLM or used to train it, so you retain sovereignty over your data.
Okay I’m sold, where do I begin?
Over the past few months, Microsoft has been building excitement around its Copilot offering, teasing a Copilot glow-up for most of its cloud products and services. Microsoft Word, Excel, PowerPoint, Outlook, Teams, OneNote, Viva, Dynamics 365 and the full Power Platform suite are all set to receive the Copilot treatment, reinforcing Microsoft’s commitment to an AI-assisted future of work.
Right now, the only Copilot feature to be publicly available is Bing Chat in Microsoft’s Edge browser. Beyond that, Microsoft 365 Copilot is currently in a paid-for, limited private preview across approximately 600 organisations globally, offered on an invitation-only basis. It is currently unclear whether Copilot features will be included with existing licences, or whether they will follow the trend of Microsoft’s recent AI-assisted features and be released as ‘premium’ add-ons.
To learn more, catch up on our latest webinar - ChatGPT & Generative AI, the revolution begins
Written by Dean Phillips, Principal Solution Architect, Advanced
Is your M365 environment fully secured?
Advanced are running a CIS Benchmark assessment where we will analyse your existing Microsoft 365 tenant and associated configuration, providing you with a gap analysis to reach the CIS Benchmark 1.4.0 for Microsoft 365. We will then run a workshop with you to outline the recommendations and a plan of action. Interested to learn more? Drop us an email: firstname.lastname@example.org