Internet solutions firm Cloudflare today unveiled Cloudflare One for AI, its latest suite of zero-trust security controls. The tools let businesses use the latest generative AI tools securely while protecting intellectual property and customer data. The company believes the suite offers organizations a simple, fast and secure way to adopt generative AI without compromising performance or security.
“Cloudflare One provides teams of any size with the ability to use the best tools available on the internet without facing management headaches or performance challenges. In addition, it allows organizations to audit and review the AI tools their team members have started using,” Sam Rhea, VP of product at Cloudflare, told VentureBeat. “Security teams can then restrict usage only to approved tools and, within those that are approved, control and gate how data is shared with those tools using policies built around [their organization’s] sensitive and unique data.”
Cloudflare One for AI provides enterprises with comprehensive AI security through features including visibility and measurement of AI tool usage, prevention of data loss, and integration management.
Cloudflare Gateway allows organizations to keep track of the number of employees experimenting with AI services. This provides context for budgeting and enterprise licensing plans. Service tokens also give administrators a clear log of API requests and control over specific services that can access AI training data.
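In rough terms, that visibility layer behaves like a forwarding proxy that classifies outbound requests and records who is using which AI tool. The Python sketch below illustrates the idea only; the domain lists, policy decisions and function names are illustrative assumptions, not Cloudflare's implementation.

```python
# Minimal sketch (not Cloudflare's code): how a secure web gateway might
# classify outbound requests to AI services for visibility and policy.
# The domain lists and decisions here are illustrative assumptions.

from urllib.parse import urlparse

APPROVED_AI_TOOLS = {"api.openai.com"}                       # tools the security team has vetted
KNOWN_AI_TOOLS = {"api.openai.com", "bard.google.com", "claude.ai"}

def log_usage(user: str, host: str) -> None:
    """Visibility: record which employee touched which AI tool (for budgeting and licensing)."""
    print(f"AI tool usage: user={user} tool={host}")

def classify_request(url: str, user: str) -> str:
    """Return a gateway decision for one outbound request."""
    host = urlparse(url).hostname or ""
    if host not in KNOWN_AI_TOOLS:
        return "allow"                                       # not an AI tool; outside this policy
    log_usage(user, host)
    return "allow" if host in APPROVED_AI_TOOLS else "block"

if __name__ == "__main__":
    print(classify_request("https://api.openai.com/v1/chat/completions", "alice"))  # allow
    print(classify_request("https://claude.ai/chats", "bob"))                       # block
```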
Cloudflare Tunnel provides an encrypted outbound-only connection to Cloudflare’s network, while the data loss prevention (DLP) service offers a safeguard to close the human gap in how employees share data.
“AI holds incredible promise, but without proper guardrails, it can create significant business risks. Cloudflare’s zero trust products are the first to provide guardrails for AI tools, so businesses can take advantage of the opportunity AI unlocks while ensuring only the data they want to expose gets shared,” said Matthew Prince, co-founder and CEO of Cloudflare, in a written statement.
Mitigating generative AI risks through zero trust
Organizations are increasingly adopting generative AI technology to enhance productivity and innovation. But the technology also poses significant security risks. For example, major companies have banned popular generative AI chat apps because of sensitive data leaks. In a recent survey by KPMG US, 81% of US executives expressed cybersecurity concerns around generative AI, while 78% expressed concerns about data privacy.
According to Cloudflare’s Rhea, customers have expressed heightened concern about inputs to generative AI tools, fearing that individual users might inadvertently upload sensitive data. Organizations have also raised apprehensions about training these models, which poses a risk of granting overly broad access to datasets that should not leave the organization. By opening up data for these models to learn from, organizations may inadvertently compromise the security of their data.
“The top-of-mind concern for CISOs and CIOs of AI services is oversharing — the risk that individual users, understandably excited about the tools, will wind up accidentally leaking sensitive corporate data to those tools,” Rhea told VentureBeat. “Cloudflare One for AI gives those organizations a comprehensive filter, without slowing down users, to ensure that the shared data is permitted and the unauthorized use of unapproved tools is blocked.”
The company asserts that Cloudflare One for AI equips teams with the necessary tools to thwart such threats. For example, by scanning data that is being shared, Cloudflare One can prevent data from being uploaded to a service.
Furthermore, Cloudflare One facilitates the creation of secure pathways for sharing data with external services, which can log and filter how that data is accessed, thereby mitigating the risk of data breaches.
“Cloudflare One for AI gives companies the ability to control every single interaction their employees have with these tools or that these tools have with their sensitive data. Customers can start by cataloging what AI tools their employees use without effort by relying on our prebuilt analysis,” explained Rhea. “With just a few clicks, they can block or control which tools their team members use.”
The company claims that Cloudflare One for AI is the first to offer guardrails around AI tools, so organizations can benefit from AI while sharing only the data they choose to expose, without putting their intellectual property and customer data at risk.
Keeping your data private
Cloudflare’s DLP service scans content as it leaves employee devices to detect potentially sensitive data during upload. Administrators can use predefined templates for common patterns, such as Social Security or credit card numbers, or define their own sensitive terms and expressions. When a user attempts to upload data containing one or more matches, Cloudflare’s network blocks the action before the data reaches its destination.
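Conceptually, that scan is pattern matching applied to traffic before it leaves the corporate network. The Python sketch below shows the idea with simplified stand-in patterns for Social Security and credit card numbers; it is not Cloudflare's actual detection logic.

```python
# Illustrative sketch only: the kind of pattern matching a DLP service applies
# to an upload before it leaves the network. The profiles below are simplified
# stand-ins for Cloudflare's templates, not its real detection logic.

import re

DLP_PROFILES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),             # simplified U.S. SSN pattern
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),    # simplified card-number pattern
}

def inspect_upload(payload: str) -> list[str]:
    """Return the names of any DLP profiles the outgoing payload matches."""
    return [name for name, pattern in DLP_PROFILES.items() if pattern.search(payload)]

def enforce(payload: str) -> bool:
    """Block the upload if it matches a profile; allow it otherwise."""
    matches = inspect_upload(payload)
    if matches:
        print(f"Blocked upload: matched {matches}")
        return False
    return True

if __name__ == "__main__":
    enforce("Please summarize this customer record: SSN 123-45-6789")  # blocked
    enforce("Draft a polite out-of-office reply")                      # allowed
```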
“Customers can tell Cloudflare the types of data and intellectual property that they manage and [that] can never leave their organization, as Cloudflare will scan every interaction their corporate devices have with an AI service on the internet to filter and block that data from leaving their organization,” explained Rhea.
Rhea said that organizations are concerned about external services accessing all the data they provide when an AI model needs to connect to training data. They want to ensure that the AI model is the only service granted access to the data.
“Service tokens provide a kind of authentication model for automated systems in the same way that passwords and second factors provide validation for human users,” said Rhea. “Cloudflare’s network can create service tokens that can be provided to an external service, like an AI model, and then act like a bouncer checking every request to reach internal training data for the presence of that service token.”
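A rough Python sketch of that bouncer pattern follows; the header names and token values are illustrative assumptions for the example, not Cloudflare's actual service-token API.

```python
# Hedged sketch of the "bouncer" pattern Rhea describes: an automated caller
# (such as an AI model) presents a service token, and every request to the
# training data is checked for it before being admitted. Header names and
# token values here are assumptions, not Cloudflare's real interface.

import hmac

EXPECTED_CLIENT_ID = "ai-model-service"
EXPECTED_CLIENT_SECRET = "s3cr3t-issued-to-the-model"   # issued once to the external service

def is_authorized(headers: dict[str, str]) -> bool:
    """Admit a request to internal training data only if its service token checks out."""
    client_id = headers.get("Service-Token-Id", "")
    client_secret = headers.get("Service-Token-Secret", "")
    return (
        hmac.compare_digest(client_id, EXPECTED_CLIENT_ID)
        and hmac.compare_digest(client_secret, EXPECTED_CLIENT_SECRET)
    )

if __name__ == "__main__":
    print(is_authorized({"Service-Token-Id": "ai-model-service",
                         "Service-Token-Secret": "s3cr3t-issued-to-the-model"}))  # True
    print(is_authorized({}))                                                      # False
```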
What’s next for Cloudflare?
According to the company, Cloudflare’s cloud access security broker (CASB), a security enforcement point between a cloud service provider and its customers, will soon be able to scan the AI tools businesses use and detect misconfiguration and misuse. The company believes that its platform approach to security will enable businesses worldwide to adopt the productivity enhancements offered by evolving technology and new tools and plugins without creating bottlenecks. Additionally, the platform approach will ensure companies comply with the latest regulations.
“Cloudflare CASB scans the software-as-a-service (SaaS) applications where organizations store their data and complete some of their most critical business operations for potential misuse,” said Rhea. “As part of Cloudflare One for AI, we plan to create new integrations with popular AI tools to automatically scan for misuse or incorrectly configured defaults to help administrators trust that individual users are not accidentally creating open doors to their workspaces.”
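In practice, that kind of posture check amounts to walking an inventory of SaaS resources and flagging risky defaults. The Python sketch below illustrates the idea; the resource shape and field names are assumptions made for the example, not Cloudflare CASB's data model.

```python
# Illustrative sketch of a CASB-style posture check: walk the resources an AI
# integration exposes and flag risky defaults such as public sharing. The
# resource shape and field names are assumptions for this example only.

from dataclasses import dataclass, field

@dataclass
class WorkspaceResource:
    name: str
    shared_publicly: bool
    third_party_plugins: list[str] = field(default_factory=list)

def find_misconfigurations(resources: list[WorkspaceResource]) -> list[str]:
    """Return human-readable findings an administrator should review."""
    findings = []
    for r in resources:
        if r.shared_publicly:
            findings.append(f"{r.name}: shared publicly; anyone with the link can read it")
        if r.third_party_plugins:
            findings.append(f"{r.name}: grants data access to plugins {r.third_party_plugins}")
    return findings

if __name__ == "__main__":
    inventory = [
        WorkspaceResource("sales-notes", shared_publicly=True),
        WorkspaceResource("roadmap-drafts", shared_publicly=False,
                          third_party_plugins=["summarizer-bot"]),
    ]
    for finding in find_misconfigurations(inventory):
        print(finding)
```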
He said that, like many organizations, Cloudflare anticipates learning how users will adopt these tools as they become more popular in the enterprise, and is prepared to adapt to challenges as they arise.
“One area where we have seen particular concern is the data retention of these tools in regions where data sovereignty obligations require more oversight,” said Rhea. “Cloudflare’s network of data centers in over 285 cities around the world gives us a unique advantage in helping customers control where their data is stored and how it transits to external destinations.”