Cloudflare has extended its single-vendor SASE platform, Cloudflare One, to generative artificial intelligence (AI) services. Cloudflare One for AI, a suite of Zero Trust security controls, will enable enterprises to use the latest generative AI tools safely and securely, without putting intellectual property and customer data at risk.
With every transformative step forward in technology, from mobile phones to cloud computing, new security threats rise to the surface. Major companies have banned the use of popular generative AI chat apps because of sensitive data leaks, and Italy instituted a temporary ban on generative AI tools over inadequate user data protections. According to a KPMG survey on generative AI, AI is expected to have an enormous impact on business, but the majority of US executives surveyed are years away from implementing it; cybersecurity (81%) and data privacy (78%) are the concerns most top of mind for leaders. CISOs and CIOs need to strike a balance between enabling transformative innovation through AI and maintaining compliance with data privacy regulations. Whether it’s an employee experimenting with AI or a company initiative, once proprietary data is exposed to AI, there is no way to reverse it.
“AI holds incredible promise, but without proper guardrails it can create significant risks for businesses. It is far too easy, by default, to upload sensitive internal or customer data to AI tools. Once the data is used for training AI, it is virtually impossible to get it out,” explained Matthew Prince, co-founder and CEO of Cloudflare. “If you were going to let a class of university students rummage around in your internal data, you’d of course put clear rules in place on what data they can access and how it can be used in their education. Cloudflare’s Zero Trust products are the first to provide the guard rails for AI tools, so businesses can take advantage of the opportunity AI unlocks while ensuring only the data you want to expose gets shared.”
Cloudflare One for AI provides a simple, fast, and secure way for companies to build using the latest generative AI technologies, without compromising security or performance. With Cloudflare One, companies can gain visibility into and measure AI tool usage, prevent data loss, and manage integrations:
- Cloudflare Gateway lets companies observe how many employees are experimenting with AI services and adds context for planning budgets and enterprise licensing.
- When building ChatGPT plugins for internal and external use, service tokens give administrators a clear log of API requests, control over which specific services can access AI training data, and the ability to revoke tokens with a single click.
- When teams are ready to allow an AI service to connect to their infrastructure, Cloudflare Tunnel provides an encrypted, outbound-only connection to Cloudflare’s network, where every request is checked against the access rules configured for services protected by Cloudflare One.
- Cloudflare’s Data Loss Prevention (DLP) service provides a safeguard to close the human gap in how employees may share data. Simple pre-configured options can check for data that looks like Social Security numbers or credit card numbers, and custom scans can look for patterns based on the data configurations of a specific team. More granular rules can even allow select users to experiment with projects containing sensitive data, while placing stronger limitations on the majority of teams and employees.
- Cloudflare’s cloud access security broker (CASB) service provides comprehensive visibility into and control over SaaS apps. Soon, Cloudflare CASB will also be able to scan the AI tools a team uses and detect misconfiguration and misuse.
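As a sketch of the Tunnel piece above: the `cloudflared` connector reads an ingress configuration that maps hostnames to private services, so the origin never needs an inbound firewall opening. The tunnel ID, hostname, and local port below are illustrative placeholders:

```yaml
# cloudflared config.yml (all values are placeholders)
tunnel: 6ff42ae2-765d-4adf-8112-31c55c1551ef
credentials-file: /etc/cloudflared/6ff42ae2-765d-4adf-8112-31c55c1551ef.json
ingress:
  # Route requests for this hostname to a locally running AI service;
  # Cloudflare One access rules are evaluated before traffic arrives here.
  - hostname: ai-service.example.com
    service: http://localhost:8000
  # A catch-all rule is required as the final ingress entry.
  - service: http_status:404
```

The outbound-only design is the point: the connector dials out to Cloudflare’s network, and every inbound request is filtered by the configured access rules before it is handed to the tunnel.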
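To make the service-token flow above concrete: a backend service authenticates to an application protected by Cloudflare Access by attaching two request headers, `CF-Access-Client-Id` and `CF-Access-Client-Secret`, which Cloudflare validates before any traffic reaches the protected origin. A minimal sketch, with a purely illustrative URL and placeholder credentials:

```python
import urllib.request

def service_token_request(url: str, client_id: str, client_secret: str) -> urllib.request.Request:
    """Build a request carrying Cloudflare Access service-token credentials.

    Cloudflare Access reads these two headers, validates the token, and only
    then forwards the request to the protected application. Revoking the
    token in the dashboard immediately cuts off this caller.
    """
    return urllib.request.Request(
        url,
        headers={
            "CF-Access-Client-Id": client_id,          # token ID issued by Cloudflare Access
            "CF-Access-Client-Secret": client_secret,  # token secret (store securely, never hard-code)
        },
    )

# Hypothetical internal endpoint; the request is only constructed here, not sent.
req = service_token_request(
    "https://internal-api.example.com/training-data",
    "my-token-id.access",
    "my-token-secret",
)
```

Because the token identifies a service rather than a person, the audit log shows exactly which integration made each API call, which is what makes one-click revocation meaningful.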
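The DLP checks described above boil down to pattern scans plus validators that cut false positives. A minimal illustrative sketch, not Cloudflare’s engine: it pairs a regex for Social-Security-number and card-like digit runs with the standard Luhn checksum so that arbitrary 16-digit strings are not flagged as credit cards:

```python
import re

# Pre-configured-style patterns (illustrative; real DLP profiles are richer).
SSN_RE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
CARD_RE = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    """Luhn checksum: filters out digit runs that cannot be real card numbers."""
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan(text: str) -> list:
    """Return the kinds of sensitive data detected in text."""
    findings = []
    if SSN_RE.search(text):
        findings.append("ssn")
    for match in CARD_RE.finditer(text):
        if luhn_valid(match.group()):
            findings.append("credit_card")
    return findings
```

A gateway applying rules like these can block an upload to an AI tool outright, or, per the more granular policies mentioned above, allow it only for a select group of users.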
Generative AI is an exciting technology with the promise to transform how we work. As this technology evolves and new tools and plugins are developed, Cloudflare’s platform approach to security will ensure that enterprises everywhere can embrace these productivity enhancements without creating bottlenecks, while ensuring compliance with the latest regulations.