OpenAI Announces Update: Significantly Enhanced AI Models

On June 13, 2023 (Beijing time), OpenAI announced a series of updates to its generative AI models GPT-3.5 Turbo and GPT-4, aimed at improving artificial intelligence capabilities in the workplace.

These iterations, including the introduction of new function calling capabilities, improved steerability, expanded context for GPT-3.5 Turbo, and a revised pricing structure, aim to give developers an expanded toolbox for creating sophisticated, high-performance AI applications that can handle the complexities of the modern work environment.

**Previous summary: OpenAI drives everything?**

Note: Developers are not the only segment that will benefit from the latest improvements to OpenAI's GPT models. We have already seen this play out: Microsoft partnered with OpenAI to bring generative AI into popular products like Bing and Office; Snapchat launched its generative AI chatbot My AI using OpenAI's GPT models; Salesforce released its first generative AI CRM product, Einstein GPT, powered by OpenAI's most advanced models; Morgan Stanley announced a partnership with OpenAI, becoming one of the few wealth management firms with access to the latest GPT-4 model; HubSpot built ChatSpot.ai on OpenAI's GPT-4; and Stripe incorporated OpenAI's GPT technology to help understand customers and reduce fraud.

**So, what improvements has OpenAI made to GPT-3.5 Turbo and GPT-4?**

OpenAI announced updates to the GPT-3.5 Turbo and GPT-4 models, including the introduction of new function calling in the Chat Completions API, improved steerability, extended context for GPT-3.5 Turbo, and further price reductions.

Product updates

  • New function calling capability in the Chat Completions API
  • Updated, more steerable versions of gpt-4 and gpt-3.5-turbo
  • New 16k context version of gpt-3.5-turbo (vs. standard 4k version).
  • The cost of the latest embeddings model has been reduced by 75%.
  • The input token cost of gpt-3.5-turbo is reduced by 25%.
  • Announced the deprecation timeline for the gpt-3.5-turbo-0301 and gpt-4-0314 models
  • All models will follow the same standard data privacy and security guarantees launched by OpenAI on March 1, 2023, and customer API data will not be used for training.

With function calling, developers can, for example:

  • Create chatbots that answer questions by calling external tools
  • Translate natural language queries into function calls, API calls, or database queries
  • Extract structured data from text

The new API parameters give developers a way to describe functions to the model and, optionally, to ask the model to call a specific function.

**The introduction of function calling opens up new possibilities for developers to seamlessly integrate GPT models with other APIs or external tools.**

For example, a workplace application could use this capability to convert a user's natural language query into a function call to a CRM or ERP system, making the application more user-friendly and efficient. While OpenAI notes potential security risks associated with untrusted data, it recommends that developers only obtain information from trusted tools and protect their applications by including user-confirmation steps before performing impactful actions.
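As a rough illustration of that recommendation, here is a minimal sketch of a confirmation gate; the helper name and flow are hypothetical, not part of OpenAI's API (the model only proposes the call, and the application decides whether to run it):

```python
# Hypothetical confirmation gate: the model has proposed a call to, say, a CRM function,
# and the application asks the user before executing anything impactful.
def confirm_and_execute(proposed_call, execute):
    """proposed_call: {'name': str, 'arguments': dict} as suggested by the model.
    execute: the application's own callable that actually performs the action."""
    print(f"The assistant wants to run {proposed_call['name']} "
          f"with arguments {proposed_call['arguments']}.")
    if input("Proceed? [y/N] ").strip().lower() == "y":
        return execute(proposed_call["name"], proposed_call["arguments"])
    return None  # user declined; nothing is executed
```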

Function calling

Developers can now describe functions to gpt-4-0613 and gpt-3.5-turbo-0613 and have the model intelligently choose to output a JSON object containing the arguments needed to call those functions. This is a new way to connect GPT's capabilities with external tools and APIs.

(Previously, the application had to parse the model's free-form output and decide which of its own functions to call. Now the developer describes all the possible calls up front and lets the model choose.)

Based on developer feedback and real-world requirements, these updated models let developers describe functions and have the model generate JSON objects containing the arguments for those functions, making the connection between GPT's capabilities and external tools and APIs more reliable. This also enables better retrieval of structured data from the model. The new function calling capability supports a variety of applications.

These models have been fine-tuned to both detect when a function needs to be called (based on the user's input) and to respond with JSON that adheres to the function signature. Function calling lets developers more reliably get structured data back from the model.

For example, developers can:

Create chatbots that answer questions by calling external tools (as ChatGPT plugins do). For example, a query like "email Anya to see if she wants to get coffee next Friday" can be converted into a function call such as send_email(to: string, body: string), and "What's the weather like in Boston?" can be converted into get_current_weather(location: string, unit: 'celsius' | 'fahrenheit').

(Author's note: the API is now doing what plugins do, and some of LangChain's capabilities are likely to be displaced.)

Translate "Who are the top 10 customers this month?" into internal API calls like

get_customers_by_revenue(start_date: string, end_date: string, limit: int), or convert "How many orders did Acme, Inc. have last month?" directly into SQL statements, using sql_query (query: string).

Extract structured data from text. For example, define a function called extract_people_data(people: [{name: string, birthday: string, location: string}]) to extract all the people mentioned in a Wikipedia article.

These use cases are enabled by the new functions and function_call parameters in OpenAI's /v1/chat/completions endpoint, which allow developers to describe functions to the model via JSON Schema and optionally ask it to call a specific function. A minimal end-to-end sketch follows.
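The sketch below walks through the weather example above using the openai Python library as it existed around this announcement (the v0.27-style interface); treat it as illustrative rather than official sample code, and note that the stand-in weather result is fabricated for the example:

```python
import json
import openai  # pip install openai; assumes OPENAI_API_KEY is set in the environment

# Describe the function to the model using JSON Schema.
functions = [{
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "City and state, e.g. Boston, MA"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["location"],
    },
}]

messages = [{"role": "user", "content": "What's the weather like in Boston?"}]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=messages,
    functions=functions,
    function_call="auto",  # let the model decide whether a function call is needed
)
message = response["choices"][0]["message"]

if message.get("function_call"):
    # The model chose to call the function; arguments arrive as a JSON string.
    args = json.loads(message["function_call"]["arguments"])
    # Stand-in for calling a real weather API with those arguments.
    weather = {"location": args["location"], "temperature": "22", "unit": "celsius"}
    # Send the function result back so the model can answer the user in natural language.
    messages.append(message)
    messages.append({"role": "function", "name": "get_current_weather",
                     "content": json.dumps(weather)})
    final = openai.ChatCompletion.create(model="gpt-3.5-turbo-0613", messages=messages)
    print(final["choices"][0]["message"]["content"])
```

The pattern is a two-step round trip: the model first returns the proposed call, the application executes it (ideally only after the kind of user confirmation mentioned earlier), and the tool's output is passed back with the "function" role so the model can summarize the result for the user.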

Developers can get started with function calling in the developer docs and add evals if they find cases where function calling could be improved.

Model improvements

The new GPT-4 and GPT-3.5 Turbo models include improved steerability and expanded context.

Developers can take advantage of the improved steerability to design AI applications that are more organization- or task-specific, such as generating more targeted business reports or building detailed, context-aware customer-service chatbots.
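For instance, a developer might lean on the system message to steer the tone and format of a customer-service reply. This is a minimal sketch under the same v0.27-era openai library assumption; the prompt wording, company name, and temperature are illustrative only:

```python
import openai  # assumes OPENAI_API_KEY is set in the environment

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        # The updated models are meant to follow system-message instructions more reliably.
        {"role": "system", "content": (
            "You are a support agent for Acme Corp. Reply in three short bullet points, "
            "cite the relevant policy section, and never promise refunds over $100."
        )},
        {"role": "user", "content": "My order arrived damaged. What are my options?"},
    ],
    temperature=0.3,  # keep customer-facing answers consistent
)
print(response["choices"][0]["message"]["content"])
```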

The newly released gpt-3.5-turbo-16k offers four times the context length of the standard GPT-3.5 Turbo, supporting roughly 20 pages of text in a single request. This expanded context enables the model to understand and generate responses over much larger bodies of text.

For example, in legal or academic workplaces, where documents are often lengthy, this feature can greatly improve a model's ability to understand and summarize large volumes of text, making information extraction more efficient. Likewise, for project management applications, it could allow AI to process and understand the entire project plan at once, helping to generate more insightful project analysis and forecasts.
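As a sketch of how the larger context window might be used to summarize a lengthy document in one request (the file name and prompts are hypothetical, and the "20 pages" figure is only a rough guide to what fits in 16k tokens):

```python
import openai  # assumes OPENAI_API_KEY is set in the environment

# Roughly 20 pages of text can fit into a single 16k-token request.
with open("project_plan.txt", encoding="utf-8") as f:
    long_document = f.read()

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-16k",  # 4x the context of the standard 4k model
    messages=[
        {"role": "system", "content": "You are an analyst who writes concise executive summaries."},
        {"role": "user", "content": "Summarize the key risks and milestones in this plan:\n\n" + long_document},
    ],
)
print(response["choices"][0]["message"]["content"])
```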

OpenAI also announced the deprecation of the earlier GPT-4 and GPT-3.5 Turbo versions, which will remain available until September 13. Developers are guaranteed a smooth transition and encouraged to provide feedback to help refine the process.

Lower prices

As its systems become more efficient, OpenAI is passing the savings on to developers: the price of the popular text-embedding-ada-002 model has been cut by 75%, and the cost of input tokens for the GPT-3.5 Turbo model has been reduced by 25%.

Combined with improved functionality, the reduced cost will make it easier for developers to use and experiment with these models in their applications.

Continuing to develop the GPT models

OpenAI is committed to continuously improving its platform based on developer feedback. With the latest enhancements to its generative AI models, OpenAI is opening new possibilities for developers to create innovative and improved AI applications for the workplace. New API updates and GPT models provide developers with additional capabilities to create AI applications better suited to handle the complexities and specificities of common tasks in workplace environments.

Other Interpretations & Speculations

Earlier in 2023, the author published an article, "Just now! An interpretation of OpenAI's launch of ChatGPT Plugins,"

which mentions:

**1. For platform companies, the durable advantage of the future will be limited to their own data. That data will be accessed directly through AI, and users will skip the engineering architecture that sits in the middle.**

For example, to build a new "Site C" by imitating an existing "Site B", the traditional approach is to study Site B's business architecture first and then have an architect clone it, presumably with open-source code. With the emergence of ChatGPT Plugins, that approach may become a thing of the past; the future model may be something like "platform business leasing," where a platform's data is delivered to users directly in one step.

**2. OpenAI, once the industry's referee, is now stepping onto the field itself: it is both player and referee. It all happened very fast.**

So, today I add some thoughts:

**The essence of Plugins is resource equalization. OpenAI will become the king of big data, not just another service provider. The equalization of existing resources is, I think, the real core of this round of AI. When we think about these problems, we have to think in terms of accumulated technology, not just from the perspective of traffic.**
