
2 Questions: Is Copilot Worth It and Is Copilot Safe?

Introduction: The Rise of AI Assistants in the Workplace

Bob Dylan’s classic song The Times They Are a-Changin’ feels more relevant than ever as AI continues to transform our daily lives. Today, countless computer users rely on AI assistants to boost productivity and streamline their workflows. In many ways, these tools act as copilots, offering support and guidance throughout the workday.

AI copilots, like GitHub Copilot and Microsoft Copilot, are now simplifying and automating many of the tasks that were once manual and time-consuming. For instance, GitHub Copilot provides real-time code suggestions and explanations. This allows developers to focus on solving complex problems rather than mundane coding tasks. Microsoft Copilot takes many of the mundane tasks from users by integrating with Office apps such as Word, Excel, and Teams to draft documents, analyze data, and summarize meetings.

As AI technology improves, these copilots are evolving into proactive collaborators. They predict user needs, offer personalized recommendations, and even anticipate project risks before they arise. This shift from reactive assistance to proactive augmentation enables knowledge workers to focus on strategic and creative endeavors. It not only improves efficiency and reduces manual errors but also fosters innovation by freeing users from routine tasks. Welcome to the new era of Generative AI.

However, before we hastily embrace this transformative technology that seems to have appeared overnight, we should carefully consider several important questions:

  • How secure and compliant are AI copilots?
  • What are the limitations and risks of using AI copilots?
  • What are the long-term implications of AI copilots in software development?
  • Is Copilot safe?
  • Is Copilot worth it?


Copilot in Action: What It Actually Does

Before we start evaluating how safe these AI copilots are, let’s first discuss what they do to determine how much benefit we get from them.

Microsoft Copilot

Microsoft Copilot integrates AI across Microsoft 365 apps (Word, Excel, PowerPoint, Outlook, Teams) to enhance workflow automation and decision-making in the following ways:

  • Automates routine tasks across Word, Excel, PowerPoint, Outlook, and Teams
  • Transcribes, summarizes, and highlights key points in Teams meetings and long email threads in Outlook
  • Helps draft original content by offering relevant suggestions based on user input
  • Assists in Excel by generating formulas, insights, and charts

Microsoft Copilot is embedded within Microsoft 365 applications including Word, Excel, PowerPoint, Outlook, Teams, and Dynamics 365. It also supports external integrations through APIs like Microsoft Graph API and OpenAI plugins for extended functionality across third-party software.

GitHub Copilot

GitHub Copilot integrates into development environments to assist programmers with the following tasks and more:

  • Supports tasks such as code completion, debugging, and refactoring
  • Identifies syntax issues and suggests improvements
  • Acts as a virtual coding partner to speed up the development process
  • Offers conversational assistance through Copilot Chat, enabling developers to ask coding questions, understand syntax, or get explanations for selected code

GitHub Copilot is compatible with popular IDEs such as Visual Studio Code, Visual Studio, Vim, Neovim, the JetBrains suite of IDEs, and Azure Data Studio. It also supports Azure DevOps and is available on GitHub Mobile.

Measuring the Impact: Productivity Gains and Limitations

Even though enterprises have barely had time to integrate generative AI technologies, the results of implementation can already be seen:

  • According to a 2023 McKinsey study, software developers can complete coding tasks up to twice as fast with generative AI. The study showed that developers could write new code in nearly half the time and optimize existing code in two-thirds the time
  • A Faros AI Study found that developers using Copilot completed tasks 55% faster compared to those without it.
  • According to a recent survey, businesses that effectively implement AI technologies experience a substantial increase in productivity, with some reporting up to a 40% improvement in efficiency and 30% reduction in operational costs.
  • A Microsoft study showed that heavy users of Microsoft Copilot experienced a 20% increase in close rates for sales teams.

Copilot Performance Metrics

According to the Faros AI study mentioned earlier, GitHub Copilot can reduce IT project costs by 9.3% to 11.8%, with savings increasing to 18–19% during the build phase of software development. These savings stem from faster coding, improved quality, and reduced debugging time.

Forrester released a 2024 study on the economic impact of Microsoft Copilot and found that businesses that deployed it across their organization experienced an ROI ranging from 132% to 353% over a three-year period. In addition, they saw a 25% acceleration in new-hire onboarding.

Gaining Back Time and Money

By reducing task duration by up to 55%, GitHub Copilot can save developers approximately 2.4–6 hours per week depending on their coding workload. For a freelancer charging $50–$100/hour, the annual $120 licensing fee can be recouped in roughly one to two and a half hours of billable time.

A simple analysis shows that Microsoft Copilot users can gain one extra hour of productivity each day. Based on a 20-day work month at a rate of $20 per hour, that equates to $400 of additional productivity per month, for a price of $40 (the monthly cost of Copilot).
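The back-of-the-envelope math above can be sketched in a few lines of Python. The hourly rates and license costs below are the illustrative figures used in this article, not official pricing:

```python
# Back-of-the-envelope payback math for the figures quoted above.
# All rates and license costs are illustrative assumptions, not official pricing.

def payback_hours(annual_license_cost: float, hourly_rate: float) -> float:
    """Billable hours of saved work needed to recoup an annual license."""
    return annual_license_cost / hourly_rate

def monthly_gain(hours_saved_per_day: float, workdays: int,
                 hourly_rate: float, monthly_license_cost: float) -> float:
    """Net monthly value of the time recovered, minus the license fee."""
    return hours_saved_per_day * workdays * hourly_rate - monthly_license_cost

# Freelancer at an assumed $60/hour with a $120/year GitHub Copilot license:
hours_to_break_even = payback_hours(120, 60)   # 2.0 hours of saved work

# Office worker gaining 1 hour/day over a 20-day month at $20/hour,
# paying $40/month for Microsoft Copilot:
net_monthly_value = monthly_gain(1, 20, 20, 40)  # $400 gross - $40 license = $360
```

Plugging in your own rates and workload is the quickest way to sanity-check whether a license pays for itself.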

Large enterprises can save thousands of hours on an annual basis, translating into hundreds of thousands of dollars of savings. The evidence clearly shows that money invested in AI Copilots is money well spent.

Security and Privacy: How Copilot Accesses and Uses Your Data

Both GitHub Copilot and Microsoft Copilot rely on large language models (LLMs) trained on vast amounts of publicly available and proprietary data. Because these AI assistants work with so much of your data, a real concern is whether that data could be absorbed by the models and exposed to other Copilot users.

GitHub Copilot processes the code you write locally on your machine to generate real-time suggestions. This minimizes the risk of exposing or storing your code externally. While it does temporarily use contextual information, such as code snippets and comments, to generate suggestions, this data is not stored long term or used to train the language models.

As for Microsoft Copilot, prompts and responses are processed within Microsoft’s Azure OpenAI environment and are not used to train foundation models. Data remains within the organization’s tenant and is encrypted during storage.

What about permissions and access management?

  • GitHub Copilot requires read access to open files in the developer’s IDE and inherits permissions granted by the user’s local setup
  • For Microsoft Copilot, permissions are managed through Microsoft Entra ID, which authenticates users and enforces role-based access controls. Users can only access data they are authorized to view within the Microsoft 365 ecosystem

Known Vulnerabilities and Past Incidents

According to a 2024 New York University study, approximately 40% of GitHub Copilot-generated code contains security vulnerabilities. The AI assistant is also prone to suggesting code with hardcoded credentials or outdated libraries. Due to security concerns about sensitive data exposure and over-permissioning, the U.S. Congress banned staff from using Microsoft Copilot in 2024.
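As a hypothetical illustration of the hardcoded-credential pattern the study warns about, compare an unsafe suggestion with a remediated version. The variable names and environment key here are invented for the example:

```python
import os

# Unsafe pattern an assistant might suggest: a secret baked into source code.
# DB_PASSWORD = "s3cret-password"   # never commit credentials like this

# Remediated pattern: read the secret from the environment at runtime,
# failing loudly if it is missing rather than falling back to a default.
def get_db_password() -> str:
    password = os.environ.get("DB_PASSWORD")
    if password is None:
        raise RuntimeError(
            "DB_PASSWORD is not set; configure it in your secret store"
        )
    return password
```

Reviewing AI-generated code for patterns like the commented-out line above should be part of any team's acceptance checklist.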

How to Safely Deploy Copilot in Your Organization

Your organization must have a plan outlining how you will deploy Copilot along with proactive strategies to prevent data misuse, unauthorized access, and compliance risks. You should have established role-based access controls (RBAC) in place to ensure that users only interact with data relevant to their responsibilities. Other best practices include:

  • Implement least privilege access to restrict Copilot’s ability to pull from sensitive files unless necessary
  • Define data access policies that specify what Copilot can analyze, summarize, or suggest based on user roles
  • Conduct regular audits to review permissions and detect unauthorized access or potential oversharing
  • Create dedicated service accounts for Copilot with strict boundaries
  • Implement Just-in-Time (JIT) access that grants temporary permissions for sensitive tasks

Organizations should implement a data classification framework that categorizes content based on confidentiality levels. This classification system helps prevent Copilot from inadvertently exposing sensitive information during use.
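The policy logic behind such a classification framework can be sketched as follows. This is not a Microsoft API; the tiers, roles, and clearance mapping are invented for illustration (real deployments would map to Microsoft Purview sensitivity labels and Entra ID roles):

```python
from enum import IntEnum

# Illustrative sensitivity tiers; real deployments would use Purview
# sensitivity labels rather than this toy enum.
class Classification(IntEnum):
    PUBLIC = 0
    INTERNAL = 1
    CONFIDENTIAL = 2
    RESTRICTED = 3

# Hypothetical mapping from role to the highest tier that role may see.
ROLE_CLEARANCE = {
    "intern": Classification.PUBLIC,
    "analyst": Classification.INTERNAL,
    "manager": Classification.CONFIDENTIAL,
}

def copilot_may_read(role: str, doc_label: Classification) -> bool:
    """Allow the assistant to surface a document only if the requesting
    user's role is cleared for the document's classification tier."""
    clearance = ROLE_CLEARANCE.get(role, Classification.PUBLIC)
    return doc_label <= clearance
```

The key design point is that unknown roles default to the most restrictive clearance, so a misconfigured account fails closed rather than open.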

For compliance and security of AI outputs, Microsoft offers several integrated tools: Microsoft Purview enforces data loss prevention (DLP) policies, while Copilot Compliance controls provide governance mechanisms. Copilot also includes encryption and data masking capabilities to ensure that sensitive data processed by Copilot remains protected, maintaining privacy even during AI interactions.

Comparing Copilot with Alternatives

Copilot is not the only AI assistant available; several other generative AI services compete in this space. Here is how Copilot stacks up against them:

Microsoft Copilot: Designed for seamless integration within the Microsoft 365 ecosystem, Copilot offers built-in compliance and security features aligned with enterprise policies. It enhances productivity across Microsoft apps, making it an excellent choice for businesses already using Microsoft products. However, its functionality is largely confined to Microsoft applications, and its licensing model may not be ideal for casual users.

ChatGPT: A highly flexible, general-purpose AI assistant capable of handling a wide range of tasks. It is available as a standalone tool without requiring enterprise software licenses. Despite its huge popularity, it lacks real-time business data access and application integrations. Because its security and compliance controls are less robust than those of enterprise-focused solutions, it may not be suitable for organizations with strict regulatory requirements.

Bito: Specially designed for software developers with a strong focus on compliance and secure coding practices. It offers features such as AI-powered code generation, security insights, and DevOps automation. While it prioritizes code security and efficiency, it is not as versatile for general business productivity tasks and lacks deep integration with Microsoft Office applications.

In short, each offering caters to different needs. A determining factor for choosing Copilot is whether you want its native integration with Microsoft 365 products. Also note that while GitHub Copilot is purpose-built for coding, Microsoft 365 Copilot lacks deep programming functionality compared with the code-generation abilities of Bito or ChatGPT.

For freelance professionals and small businesses, ChatGPT is often the best choice due to its affordability, broad capabilities, and lack of software restrictions. Bito may be the most useful for freelance developers who need secure AI-driven coding assistance. For larger enterprises dependent on the Microsoft 365 product suite, Copilot is a great choice.

Individual Perspectives: Freelancers, Students, and Teams

Copilot serves diverse users from individual developers and students to large enterprise teams, though its value varies based on specific needs. Solo developers can utilize Copilot within Visual Studio Code for coding assistance, debugging, and documentation creation. Small businesses benefit from Copilot in Microsoft 365 for streamlined document creation, email composition, and data analysis.

However, Copilot’s enterprise-oriented licensing model presents limitations for individual users without Microsoft 365 subscriptions. While Microsoft offers academic institutions access to Copilot, availability depends on the school’s existing Microsoft 365 subscription level. Students outside institutional settings may find Copilot less accessible compared to alternatives like ChatGPT or GitHub Copilot for Education.

The Verdict: Who Should Use Copilot and Under What Conditions

The truth is that nearly anyone who uses Microsoft 365 applications in their daily work can greatly benefit from Microsoft Copilot, while developers can accelerate their coding with GitHub Copilot. Of course, cost comes into play, and licensing plays a big role in deciding whether investing in Copilot is worth it for your organization. If you are deploying Copilot for the first time, target the users who will gain the most value using a phased rollout. This can help establish best practices and determine actual ROI before wider implementation. The future of work undoubtedly includes AI assistance, but successful integration depends on thoughtful implementation that balances innovation with pragmatic considerations of cost, security, and organizational readiness.

Frequently Asked Questions (FAQ)

Can Copilot access confidential code or data?

Yes, Copilot can potentially access confidential code or data, which is a real security concern. In code environments, GitHub Copilot can “see” the code in your open files and the context of your project when generating suggestions, while Microsoft Copilot can access documents, emails, and other content when assisting Microsoft 365 users.

Is Copilot compliant with data protection laws?

Copilot is designed with compliance features, but its alignment with data protection laws depends on your configuration and usage. Microsoft has built GDPR, CCPA, and other regulatory compliance capabilities into the platform, including data residency options.

Does Copilot work offline?

Neither GitHub Copilot nor Microsoft Copilot can operate offline as they both rely on cloud-based AI models hosted on servers to process user inputs and generate responses. Even when working in local development environments or applications, the AI functionality itself needs internet connectivity to function properly.

What’s the difference between GitHub Copilot and Microsoft Copilot?

GitHub Copilot is aimed at software developers to assist with coding tasks. Its capabilities include providing real-time code suggestions, auto-completing functions, and translating natural language into code. It also integrates seamlessly with popular Integrated Development Environments (IDEs) like Visual Studio Code, JetBrains IDEs, and Neovim. Microsoft Copilot, on the other hand, targets business professionals who use Microsoft 365 applications for communication and productivity. It is designed to enhance productivity across the Microsoft 365 suite, assisting in drafting documents in Word, summarizing emails or chat threads, analyzing data in Excel, and creating presentations in PowerPoint.

Dirk Schrader is a Resident CISO (EMEA) and VP of Security Research at Netwrix. A 25-year veteran in IT security with certifications as CISSP (ISC²) and CISM (ISACA), he works to advance cyber resilience as a modern approach to tackling cyber threats. Dirk has worked on cybersecurity projects around the globe, starting in technical and support roles at the beginning of his career and then moving into sales, marketing and product management positions at both large multinational corporations and small startups. He has published numerous articles about the need to address change and vulnerability management to achieve cyber resilience.