Workshop's AI policy and security protocol

Azure OpenAI

We utilize Microsoft's Azure OpenAI Service at Workshop to host the AI models that support the Smart Assist features in our platform. Following Microsoft's expanded investment in OpenAI in January 2023, Microsoft made OpenAI's models available inside the Azure OpenAI Service, exposing AI tools like ChatGPT and Davinci through the Azure API. Azure OpenAI and OpenAI's own API are co-developed, meaning they operate largely the same: as new models, parameters, or features are added to one, they are also added to the other. There are several benefits to using the Azure API over OpenAI's API.

Security & Privacy

One of the reasons OpenAI released its ChatGPT application to the public for free at launch was to collect a vast amount of data about how the model behaved. OpenAI has used this data to make constant improvements to the model post-release. This is great for the development of GPT and other large language models, but it also calls into question the security and privacy of any data you feed into the model as a prompt: anything you type into ChatGPT will be saved and may be used to train other AI models.

Microsoft has positioned its Azure OpenAI Service in contrast to this: while it utilizes the trained models from OpenAI, any data sent to an Azure OpenAI model is self-contained within that specific deployment of the model and is not used to train the model. Each company with access to the Azure OpenAI Service sets up its own deployment of an AI model in order to use it. This means that Workshop has its own self-contained deployment of GPT-3.5-turbo (ChatGPT), with its own URL, API keys, and configuration. Simply put, we have our own private deployment of ChatGPT, and any data that is collected is private and contained within our Azure environment, not accessible by Microsoft or OpenAI, and not used to train future models. You can read more about Microsoft's commitment to Responsible AI here.
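As a rough illustration of what "its own URL, API keys, and configuration" means in practice, the sketch below builds the request URL and headers for a hypothetical Azure OpenAI deployment. The endpoint, deployment name, and key shown are placeholders (not Workshop's real values); the point is that every deployment has a unique endpoint and key, and requests go to that private endpoint rather than OpenAI's shared api.openai.com.

```python
# Minimal sketch with hypothetical values: requests to an Azure OpenAI
# deployment are addressed to that deployment's own endpoint and
# authenticated with that deployment's own key.

def azure_chat_url(endpoint: str, deployment: str,
                   api_version: str = "2024-02-01") -> str:
    # The deployment name (chosen when the model was deployed) is part of
    # the URL, so traffic is scoped to that specific private deployment.
    return (f"{endpoint}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")

def azure_headers(api_key: str) -> dict:
    # Each deployment is authenticated with its own API key.
    return {"api-key": api_key, "Content-Type": "application/json"}

# Hypothetical example values -- not Workshop's actual deployment:
url = azure_chat_url("https://example-workshop.openai.azure.com", "gpt-35-turbo")
headers = azure_headers("PLACEHOLDER_KEY")
```

Because the endpoint and key are unique to the deployment, rotating or revoking access is a matter of managing that one Azure resource, independent of OpenAI's public service.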

Availability & Stability 

Since its launch, OpenAI's API has been notoriously unstable: it is frequently down or throttled. By using the Azure OpenAI Service, we get the stability of Azure's infrastructure along with control over important settings like region access and rate limiting.

Customized Solutions

When Workshop deploys models within Azure OpenAI, we have the ability to tailor those models to our specific use case. While we don't currently utilize this feature, we are considering these options to provide better experiences for our customers in the future. 

Azure OpenAI provides us the ability to add additional model training data, example prompts and responses, and custom content filters to tailor the model to our needs. Rather than the stock model with little or no customization offered by the OpenAI API, Azure lets us further refine and improve our AI tools over time to deliver a better customer experience.