Have you ever clicked a link that you thought was perfectly safe, only to find out later that it wasn't? I've done that before, but thankfully I stopped before getting too far. The new CoPhish attack, however, is even sneakier. CoPhish abuses Microsoft's Copilot Studio (a tool for building chatbots) to trick people through a convincing, legitimate-looking flow and steal the OAuth tokens tied to their logins. I will explain how it works and how you can protect yourself.
CoPhish Attack Explained: How It Works
• Hackers use Copilot Studio “agents” (chatbots built directly into Microsoft’s Copilot Studio for free) to create a fake login page.
• Because these pages are hosted on a legitimate Microsoft domain (copilotstudio.microsoft.com), they seem trustworthy.
• As you log in or “allow” access, the bot captures your OAuth token, which is your digital key to act as you.
• Then, the bot secretly sends the token to the hacker’s server.
• Once the hacker has the token, they can use it to access your email, files, or even company data through Microsoft applications without ever needing to know your password again.
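To see why a stolen token is so dangerous, here is a minimal sketch in Python (standard library only; the helper names are my own, and the specific Graph endpoint is just one illustrative example) of how anyone holding a valid OAuth access token can call Microsoft Graph as you, with no password and no MFA prompt:

```python
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def graph_headers(access_token: str) -> dict:
    # OAuth 2.0 bearer authentication: possession of the token alone
    # proves identity. No username, no password, no MFA prompt.
    return {"Authorization": f"Bearer {access_token}"}

def read_profile(access_token: str) -> dict:
    # With a token scoped to User.Read, this returns the victim's profile.
    # A token with broader scopes (Mail.Read, Files.Read.All, ...) opens
    # correspondingly more of the account.
    req = urllib.request.Request(f"{GRAPH}/me",
                                 headers=graph_headers(access_token))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

This is exactly why the CoPhish bot only needs to exfiltrate the token: until it expires or is revoked, the token works from anywhere, and the attacker never has to touch your password.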
Why It’s Difficult to Detect
• The attack relies on authentic Microsoft domains, so there is nothing to be suspicious about.
• After you click the "Allow" button, nothing seems out of the ordinary.
• You won't even be aware that your token has been compromised — this takes place entirely in the background.
What This Means
Think you, or your team, couldn't fall victim to a phishing scam? Even the smartest users can fall for this one. Here's why:
• Tokens are like extra keys: If someone steals your token, they can walk directly into your account.
• Admins are prime targets: If a hacker gets into your admin token, they can fully access entire systems.
• Fake trust: Because it appears to be a Microsoft page, users will trust the prompt by default.
• New twist, same goal: This attack is just phishing with a modern twist — using legitimate services to steal your sensitive information.
How to Protect Yourself
• Add additional security to app permissions: Don't allow regular users to approve new apps unless it is absolutely necessary.
• Watch for new Copilot agents: Regularly check whether anyone has created or modified a chatbot's topics, especially any topic titled "Login".
• Keep admin roles to a minimum: Limit admin access to only those who need its capabilities.
• Regularly train your team: Remind them that even links from Microsoft domains can be malicious.
• Audit Copilot use: If you or your organization doesn't use the sharing/preview features, turn them off.
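For the auditing step, here is a rough sketch of what an automated check could look like. It assumes you have already exported your tenant's OAuth consent grants (for example, from Microsoft Graph's oauth2PermissionGrants data) into a list of dicts; the helper name and the risky-scope list are my own illustrative choices, not an official tool:

```python
# Scopes that deserve a second look if a new or unfamiliar app holds them.
# (Illustrative list; tune it to your environment.)
HIGH_RISK_SCOPES = {"Mail.Read", "Mail.Send", "Files.Read.All", "offline_access"}

def risky_grants(grants):
    """Return the grants whose space-separated scope string contains
    any high-risk permission.

    Each grant is a dict shaped like
    {"clientId": "...", "scope": "User.Read Mail.Read"}.
    """
    flagged = []
    for g in grants:
        scopes = set((g.get("scope") or "").split())
        if scopes & HIGH_RISK_SCOPES:
            flagged.append(g)
    return flagged

sample = [
    {"clientId": "app-1", "scope": "User.Read"},
    {"clientId": "app-2", "scope": "User.Read Mail.Read offline_access"},
]
print([g["clientId"] for g in risky_grants(sample)])  # ['app-2']
```

Running a check like this on a schedule, and alerting on any new flagged app, turns the "watch for new agents" advice above into something you don't have to remember to do by hand.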
Final Thoughts
The CoPhish attack demonstrates that even trusted platforms can be turned against us. It's not just sketchy emails anymore; attackers are now abusing legitimate company applications.

If you have a Microsoft account or use Copilot Studio, it's time to review your account settings and improve your security, because the most dangerous attacks are often the ones that don't seem dangerous at all.