
Written by | Zhang Yu

Edited by | Yang Bocheng

Title image | Tuchong Creative

On January 16, Zhipu AI, a developer of AI knowledge intelligence technology, held its first technology open day (Zhipu DevDay), showcasing the technical achievements accumulated over its more than three years in the large model business and releasing its new-generation base model, GLM-4.

Zhang Peng, CEO of Zhipu AI, said that GLM-4's overall performance is greatly improved over the previous generation and approaches GPT-4: it supports longer contexts, has stronger multimodal capabilities, offers faster inference, supports higher concurrency, and greatly reduces inference costs.


Picture source: Zhipu AI

In addition, GLM-4 brings a big step up in agent capabilities. GLM-4 All Tools can understand complex instructions and plan on its own according to user intent, freely calling the web browser, code interpreter, and multimodal text-to-image model to complete complex tasks. The GLMs personalized agent customization feature launched at the same time: users can create their own GLM agents with simple prompt instructions, greatly lowering the threshold for using large models.
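As an illustration of how developers reach GLM-4 today, below is a minimal sketch of a plain chat call through Zhipu AI's open platform. It assumes the zhipuai Python SDK exposes an OpenAI-style chat interface and that an API key is available in the environment; the model name, parameters, and response fields should be checked against the official documentation, and the All Tools / agent features are not shown here.

# Minimal sketch of a GLM-4 chat call via Zhipu AI's open platform.
# Assumes the `zhipuai` Python SDK (pip install zhipuai) exposes an
# OpenAI-style chat interface; verify names and fields against the docs.
import os
from zhipuai import ZhipuAI

client = ZhipuAI(api_key=os.environ["ZHIPUAI_API_KEY"])  # key issued by the open platform

response = client.chat.completions.create(
    model="glm-4",  # new-generation base model announced at Zhipu DevDay
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize the GLM-4 launch in one sentence."},
    ],
)

print(response.choices[0].message.content)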

"catching up with open ai" and "benchmarking open ai has been the goal of Zhipu AI since its establishment" are the words Zhang Peng mentioned many times when sharing with the outside world. At present, the competition of large models is no longer from 0 to 1. Rather than a battle for nothing, it is a battle for implementation. Various large model companies have begun to fight hand-to-hand. Can Zhipu AI, known as "China Openai", successfully break through the large model competition?

1. Benchmarking against OpenAI is difficult for now

As the company behind China's largest open-source large models, Zhipu AI has a strong technical architecture, but there is still some distance to go before it can truly benchmark against OpenAI.

Zhang Peng said frankly that domestic large models started later than their foreign counterparts. Coupled with constraints on high-performance computing power and gaps in data quality, domestic large models still trail the world's most advanced by roughly one year in scale and core capabilities.

From a technical perspective, OpenAI pays more attention to versatility, portability, and scalability: its GPT series models can be applied in many scenarios and are highly customizable. In contrast, Zhipu AI's technical route is "large model + small model", adapting to the needs of different scenarios and tasks through pre-training and fine-tuning of large models. This route can improve the model's generalization ability and range of applications, but it also brings high model complexity, heavy computation, and long training times.

In terms of model scale, OpenAI's GPT series models are larger and can process vast amounts of natural language data, which yields better model performance. By comparison, Zhipu AI's models may be smaller and more limited in the data they can process, which may constrain their performance and generalization. In terms of data resources, OpenAI has abundant natural language data with which to train and optimize its models, whereas Zhipu AI may have relatively little, limiting the effectiveness of its model training.

This means that if Zhipu AI wants to close the gap with OpenAI as soon as possible, it must keep improving its large model capabilities, which naturally requires scaling up training. The other side of the coin is that Zhipu AI will then face a huge funding problem.

First of all, hardware requires a huge investment. According to estimates by the US market research firm TrendForce, processing ChatGPT's training data requires 20,000 GPUs, and as OpenAI further expands the commercial application of ChatGPT and other GPT models, its GPU demand will exceed 30,000 (the estimate is based on the A100).

In addition, the cost of training large models cannot be underestimated. According to the Guosheng Securities report "How Much Computing Power Does ChatGPT Need", a single training run of GPT-3 costs about US$1.4 million, and for some larger LLMs (large language models) the training cost ranges from US$2 million to US$12 million.

Where such huge funds will come from, and how much value they can ultimately be exchanged for, remain open questions for Zhipu AI.

Fortunately, however, Zhipu AI is favored by investors. In October 2023, Zhipu AI announced that it had raised more than 2.5 billion yuan during the year. The financing was backed by a number of well-known institutions, with main participants including the Social Security Fund and the Zhongguancun Independent Innovation Fund, as well as Meituan, Ant Group, Alibaba, Tencent, Xiaomi, Kingsoft, Shunwei Capital, BOSS Zhipin, TAL, Sequoia, Hillhouse, and many others, including some existing shareholders.


Picture source: Tianyancha

Especially since 2023, Zhipu AI has closed five successive rounds of financing, reaching a valuation of more than 10 billion yuan and becoming a "unicorn" in China's AI field. Zhipu AI has said the financing will be used to further advance the research and development of its base models, better support a broad industry ecosystem, and pursue rapid growth together with its partners.

2. Commercialization is a hard hurdle to clear

Commercialization is the most direct way to verify the value of a new technology. At this stage, domestic large models are flourishing, but most companies remain focused on technology and development and are still largely in the exploratory stage of commercialization.

Overall, large model companies make money mainly through the large model itself, large model + computing power, and large model + applications, with the first two being the primary revenue sources.

Zhipu AI's approach is broadly in line with the industry's. First, it provides customized large model development services according to customer needs, with cloud privatization and local privatization priced at up to 1.2 million yuan per year and 36.9 million yuan per year respectively. Second, it offers standard large models with API access, billed by token usage: ChatGLM-Turbo, CharacterGLM, and Text-Embedding are priced at 0.005 yuan, 0.015 yuan, and 0.005 yuan per thousand tokens respectively. At present, Zhipu AI's commercialization mainly targets B-end users such as enterprises and institutions.
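For a sense of scale, here is a rough back-of-the-envelope calculation using the per-thousand-token prices quoted above; the token counts are made-up illustrative figures, not real usage data.

# Rough API cost estimate at the per-thousand-token prices quoted above.
# Token counts below are illustrative assumptions, not real usage data.
PRICE_PER_1K_TOKENS = {       # yuan per thousand tokens
    "chatglm-turbo": 0.005,
    "characterglm": 0.015,
    "text-embedding": 0.005,
}

def cost_yuan(model: str, tokens: int) -> float:
    """Cost in yuan for a given token count billed at the model's per-1k rate."""
    return tokens / 1000 * PRICE_PER_1K_TOKENS[model]

# Example: a workload consuming 2 million chat tokens and 10 million embedding tokens
total = cost_yuan("chatglm-turbo", 2_000_000) + cost_yuan("text-embedding", 10_000_000)
print(f"{total:.2f} yuan")  # 10.00 + 50.00 = 60.00 yuan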

As a comparison, OpenAI's commercialization is also divided between the C-end and the B-end. For the C-end market, OpenAI launched the ChatGPT Plus subscription at US$20 per month; compared with the free version, subscribers can access ChatGPT normally even during peak hours, get faster response times, and have priority access to new features. For the B-end market, OpenAI released the ChatGPT API, which developers can integrate into their products to deliver value more efficiently.

It is worth noting that OpenAI also launched an enterprise edition in August 2023, which is expected to bring in 65 million yuan in revenue every month. Across 2023, the various paid plans OpenAI launched brought in more than 11 billion yuan in revenue.

A common phenomenon is that domestic large model products, including Zhipu AI's, struggle to deliver stable profits despite heavy investment. Even so, the long and arduous road to commercialization is not without hope: 360's 2023 semi-annual report shows that its "360 Intelligent Brain" large model has begun to generate revenue, amounting to nearly 20 million yuan, and SenseTime Group announced that its generative AI-related revenue grew 670% in the first half of 2023.

A related report by Ai Analysis points out that the industries currently accelerating the commercialization of large models are energy and finance, because these two sectors are dense with central state-owned enterprises. Such enterprises have well-developed data infrastructure, high investment in computing power, many AI application scenarios, and strong foundations, all of which drive their rapid integration with large models.

For Zhipu AI, the path to commercializing large models is relatively clear at present, but whether it can walk that path depends not only on exploring business models but also on solving the underlying problems of large model development.

3. Where will domestic large models go?

According to the latest data from the CCID Research Institute under the Ministry of Industry and Information Technology, China's large language model market grew rapidly in 2023 and application scenarios continued to multiply; the market is expected to reach 13.23 billion yuan in 2023, a growth rate of 110%.

In the future, with the continuous iterative progress of technology, large models will open up new application scenarios in fields such as embodied intelligence and autonomous driving.

Specifically, the development of large models will follow three major trends. First, their intelligence will improve: future large models will have higher accuracy, stronger comprehension, and wider applicability, meaning they can better understand natural language and perform more complex tasks such as overall planning and creation. Second, large models will be applied in more fields: beyond traditional text processing, they will play a greater role in speech recognition, image generation, video understanding, and more, so users can enjoy the convenience of AI in more scenarios. Third, large models will become more customizable, better meeting users' individual needs; users will be able to choose a suitable model for their actual requirements and configure it accordingly.

Large models are developing rapidly, but that does not mean development is disorderly; going forward, supervision will only become stricter. In July 2023, seven departments including the Cyberspace Administration of China issued the Interim Measures for the Management of Generative Artificial Intelligence Services, which clearly stipulate that large model products must pay attention to data privacy and security, must not illegally obtain, disclose, or use personal information, privacy, or trade secrets, and must not infringe intellectual property rights; the content generated by large models should reflect core socialist values and must not be discriminatory, among other requirements.

Zhang Peng is very optimistic about the future of large models: "In 2024, the large model market will return to calm after its wild growth. Investment in and hype around large models will come to an end, and the industry's focus will shift from the models themselves to finding applications. But this does not mean the pace of technological evolution will slow down; the ceiling for upward exploration is still far from being reached."
