Immediately after switching pages, it works via CSR (client-side rendering). Please reload your browser to see how it works.
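For context, here's a minimal sketch of the pattern being described, assuming a typical React single-page setup with react-router-dom (my assumption, not necessarily the stack in use): the first request returns HTML from the server, and every navigation after that is rendered entirely in the browser.

    // Hypothetical React app: only the first load hits the server; clicking
    // the Link below swaps routes client-side without a full page request.
    import React from "react";
    import { BrowserRouter, Routes, Route, Link } from "react-router-dom";

    function Home() {
      return <Link to="/about">Go to About (rendered client-side)</Link>;
    }

    function About() {
      return <p>This view was rendered by CSR after navigation.</p>;
    }

    export default function App() {
      return (
        <BrowserRouter>
          <Routes>
            <Route path="/" element={<Home />} />
            <Route path="/about" element={<About />} />
          </Routes>
        </BrowserRouter>
      );
    }

A hard reload of /about, by contrast, goes back through the server first.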
We haven't put a lot of marketing into the service; maybe we should do that.
They discuss this on the Enterprise page:
Protect your data with enterprise-grade privacy, security, and deployment tools

You own and control your business data in ChatGPT Enterprise. We do not train on your business data or conversations, and our models don’t learn from your usage. ChatGPT Enterprise is also SOC 2 compliant and all conversations are encrypted in transit and at rest. Our new admin console lets you manage team members easily and offers domain verification, SSO, and usage insights, allowing for large-scale deployment into enterprise. See our privacy page and our Trust Portal for more details on how we treat your data.
https://openai.com/index/introducing-chatgpt-enterprise/
So it’s very likely that most companies find this reassuring enough and therefore don’t care much about running models locally. Anyone who needs stronger guarantees than that probably has the resources to develop AI capabilities in-house.
I’m a product marketing guy for Nutanix. We have a new LLM deployment solution that runs on any CNCF-certified Kubernetes; you can deploy LLMs from a validated list on Hugging Face or from NVIDIA NIM, or upload your own.
You can then create RBAC-controlled endpoints to connect your GenAI apps to, all through a point-and-click interface.
Check it out; there's a basic walkthrough on the product page: https://www.nutanix.com/products/nutanix-enterprise-ai
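To make the "connect your GenAI apps" part concrete, here is a minimal sketch of calling such an endpoint, assuming it exposes an OpenAI-compatible chat completions API (as NVIDIA NIM microservices typically do). The base URL, API key, and model name are placeholders I made up, not values from the product.

    // Hypothetical values: swap in the endpoint URL and API key issued for
    // your RBAC-scoped endpoint, and the model name you actually deployed.
    const BASE_URL = "https://ai.example.internal/v1";
    const API_KEY = "YOUR_ENDPOINT_API_KEY";

    async function ask(prompt: string): Promise<string> {
      // Standard OpenAI-compatible chat completions request.
      const res = await fetch(`${BASE_URL}/chat/completions`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "Authorization": `Bearer ${API_KEY}`,
        },
        body: JSON.stringify({
          model: "meta/llama-3.1-8b-instruct", // example model; depends on what was deployed
          messages: [{ role: "user", content: prompt }],
        }),
      });
      if (!res.ok) throw new Error(`Request failed: ${res.status}`);
      const data = await res.json();
      return data.choices[0].message.content;
    }

    ask("Summarize our Q3 support tickets.").then(console.log);

The point is that the app side stays a plain HTTPS call; access control lives in the endpoint's RBAC policy rather than in the application code.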
IMHO, local LLMs still can't compete on quality and speed.
https://azure.microsoft.com/en-us/products/ai-services/opena...
https://openai.com/index/introducing-chatgpt-enterprise/