The GenAI models are being tested on how they respond to questions about sensitive political topics and President Xi Jinping.

Artificial intelligence (AI) companies in China are having their large language models (LLMs) tested by the government to see if they “embody core socialist values,” according to a report.

Both start-ups and large tech companies such as TikTok owner ByteDance and Alibaba will be reviewed by the government’s chief internet regulator, the Cyberspace Administration of China (CAC), according to the Financial Times (FT).

CAC officials will test the AI models on their responses to questions relating to sensitive political topics and President Xi Jinping, among other subjects.

The regulations have led China’s most popular chatbots to decline to answer questions on topics such as the 1989 Tiananmen Square protests.

Countries around the world are trying to set a blueprint for AI regulation, and China was among the first to introduce rules governing generative AI (GenAI), including requirements such as adherence to the “core values of socialism”.

One AI company in China told the FT that its model failed the first round of testing for reasons that were not clear, and only passed after “guessing and adjusting” the model.

According to the report, companies meet the censorship requirements through “security filtering”: removing “problematic information” from the AI models’ training data and then adding a database of sensitive words.
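In practice, this kind of filtering is often implemented as a simple keyword screen over the training corpus. The sketch below is a minimal illustration of that general idea, not a description of any specific company’s pipeline; the blocklist entries and function names are hypothetical.

```python
# Minimal sketch of keyword-based "security filtering" of a training corpus.
# The blocklist contents and example documents are purely illustrative.

BLOCKLIST = {"example sensitive phrase", "another blocked term"}  # hypothetical entries

def is_clean(document: str) -> bool:
    """Return True if the document contains none of the blocked phrases."""
    text = document.lower()
    return not any(phrase in text for phrase in BLOCKLIST)

def filter_corpus(documents: list[str]) -> list[str]:
    """Keep only documents that pass the keyword screen."""
    return [doc for doc in documents if is_clean(doc)]

if __name__ == "__main__":
    corpus = [
        "An ordinary news article about technology.",
        "A document mentioning an example sensitive phrase.",
    ]
    print(filter_corpus(corpus))  # only the first document survives
```

A real pipeline would typically combine such keyword screens with classifier-based filtering and a separate list of blocked prompts applied at inference time, but the basic principle is the same: exclude flagged material before training and refuse flagged queries afterwards.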

Bringing the training data sets into compliance with the rules is harder, engineers told the FT, because most LLMs are trained largely on English-language data.

The CAC is trying to strike a balance between making China a competitive AI leader and meeting the government’s socialist beliefs.

GenAI services need a license to operate, and if a service is found to provide “illegal” content, it must take measures to stop generating such content and report it to the relevant authority, the CAC said last year.

Despite the regulations, China has filed more GenAI patent applications than any other country, with ByteDance and start-up Zhipu among the companies developing their own generative AI chatbots.
