Large-Scale Pre-Training Models
Self-supervised large-scale models trained on massive amounts of data can handle a range of AI tasks with one unified model and paradigm. This technology has overcome a bottleneck of conventional approaches by reducing dependence on large labeled datasets, significantly boosting the effectiveness, universality and generalization of AI models.
R&D direction is expected to shift in 2022 from increasing the model size to practical deployment. Large-scale models will continue to advance in terms of performance, universality, generalization, operating efficiency and cost-effectiveness, complemented by technologies such as cross-modal unified modeling, prompt learning, continuous learning, model distillation and sparse modeling. At the same time, the threshold for realizing real-world AI scenarios such as smart offices and smart finance will decrease.
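Of the techniques listed above, model distillation is the easiest to illustrate: a small student model is trained to match a large teacher's temperature-softened output distribution rather than hard labels. A minimal sketch in plain Python follows; the logits and the temperature value are illustrative and not tied to any particular framework.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions.

    A higher temperature softens both distributions, so the student
    learns the teacher's relative confidence across all classes,
    not just its top prediction.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student whose logits match the teacher's incurs zero loss;
# a mismatched student incurs a positive loss.
teacher = [4.0, 1.0, 0.5]
print(distillation_loss(teacher, [4.0, 1.0, 0.5]))  # 0.0
print(distillation_loss(teacher, [0.5, 1.0, 4.0]))  # > 0
```

In practice the distillation loss is combined with an ordinary cross-entropy term on the true labels, but the teacher-matching term above is what lets a compact model inherit most of a large model's behavior.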
AI for Science
Last year, machine learning helped mathematicians formulate two major conjectures. A combination of machine learning, multi-scale modeling and high-performance computing solved the problem of simulating ultra-large random quantum circuits in real time. AI has also demonstrated great potential in scientific research, particularly for processing data, designing new experiments and creating more efficient computational models.
In the next few years, AI will be integrated into fields such as mathematics, physics, chemistry, materials science and engineering, and will play a greater role in advancing fundamental science.
AI-Powered Computational Biology
The COVID-19 pandemic has sparked an ever-growing demand for AI in the life science industry. For example, AI can be used to improve the accuracy and speed of on-target genome editing or to predict protein folding structures.
AI-powered computational biology is poised for more breakthroughs in fundamental research and applications, ranging from protein-based drug design, drug compounding and drug screening to mRNA-based monoclonal antibodies, cancer therapy and other immunotherapies. The confluence of AI and computational biology will significantly accelerate drug development, reduce its cost, and facilitate precision medicine and personalized therapies.
Privacy Computing
Privacy computing technologies, such as trusted confidential computing and federated computing, have risen to prominence because they address data security, sharing and circulation from a technical perspective. As the performance of privacy computing improves, as technology and compliance standards reinforce each other, and as multi-party collaboration strengthens technical credibility, best practices will emerge in scenarios such as computational biology, financial analysis and data transactions.
In the long run, privacy computing may make encrypted data circulation and computation the default, gradually building an infrastructure that earns user trust.
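Federated computing, one of the technologies named above, can be sketched as federated averaging: each party trains a model on its own data and shares only model weights, which a coordinator averages in proportion to local data size, so raw data never leaves the device. A toy Python sketch, where the client weight vectors and dataset sizes are invented purely for illustration:

```python
def federated_average(client_weights, client_sizes):
    """Aggregate model weights without moving raw data off-device.

    Each client trains locally and shares only its weight vector;
    the server returns the size-weighted average (the FedAvg idea).
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Three hypothetical clients with different amounts of local data.
weights = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
sizes = [100, 100, 200]
print(federated_average(weights, sizes))  # [0.5, 0.5]
```

A production system would add secure aggregation or encryption on top of this averaging step; the sketch shows only why no party needs to reveal its underlying records.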
Integration of Quantum Hardware and Software
In 2022, quantum chips’ design, preparation, measurement and control technologies are expected to improve continuously, and the number of qubits (“quantum bits”) will continue to grow. More breakthroughs will be made by reducing or adapting to noise. Quantum software and services will develop in a cross-platform direction, giving users a richer choice of quantum back ends on cloud-native quantum computing platforms. Quantum computing platforms that carry integrated software-hardware solutions will gradually show their commercial potential.
With the deep integration and innovation of quantum computing and intelligent manufacturing, artificial intelligence, chemical medicine, fintech and other fields, numerous practical application solutions with significant quantum advantages will emerge.
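At small qubit counts, the cloud quantum back ends mentioned above are typically statevector simulators. As a hedged illustration of what such a simulator does, the following Python sketch applies a Hadamard gate to a single qubit; a real platform would of course handle many qubits, entangling gates and noise models.

```python
import math

def apply_gate(state, gate):
    """Apply a single-qubit gate (2x2 matrix) to a one-qubit statevector."""
    a, b = state
    return [gate[0][0] * a + gate[0][1] * b,
            gate[1][0] * a + gate[1][1] * b]

# The Hadamard gate puts a basis state into an equal superposition.
H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

state = [1.0, 0.0]            # qubit initialized to |0>
state = apply_gate(state, H)  # now (|0> + |1>) / sqrt(2)
probs = [abs(amp) ** 2 for amp in state]
print(probs)  # both entries ~0.5: equal chance of measuring 0 or 1
```

The cost of this approach doubles with every added qubit, which is exactly why simulating ultra-large random quantum circuits counts as a milestone and why real quantum hardware remains the long-term goal.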
Autonomous Driving
Technological advances and policy regulation will bring fully driverless operation closer to reality in 2022. Autonomous driving will permeate a wide range of scenarios, including passenger vehicles, public transit, road freight, warehouse distribution, retail, sanitation and special operations in mines and ports, creating remarkable value and further promoting social progress.
Deep Space Exploration
Deep space exploration is the ultimate embodiment of humanity’s curiosity towards the universe. In the field of construction machinery automation, 24-hour continuous unmanned excavation has been achieved. The autonomous environment-perception and motion-planning algorithms used in rovers also enable functions such as autonomous obstacle avoidance and decision-making, as well as flexible autonomous operation of a robotic arm.
In addition, AI technology is expected to play a key supporting role in the detection and repair of spacecraft damage, the construction of digital twin simulation laboratories and the detection and analysis of deep space big data.
Human-Machine Symbiosis
The development of digital and intelligent technology gives us an opportunity to narrow social distance, accelerating the symbiosis between people, digital avatars and robots. This change is supported by continuous progress in AI technologies such as vision, speech, natural language processing and XR, as well as in cross-modal understanding and continual learning, backed by the integration of hardware, networks, computing, ecosystem platforms, content and many other fields.
With the accelerating integration and innovation of various advanced information technologies, more platforms involving the combination of virtual, real and intelligent interactions for various industry and consumer scenarios will emerge, enhancing the deep integration of the digital economy and the real economy and enriching people’s work and life experience.
Green AI
As AI technology accelerates innovation across industries, data centers and large-scale AI computing generate important social value, but at the same time pose challenges in the form of energy costs and environmental impact.
In the next few years, “green AI” technologies will continue to flourish, building systems around energy-efficient architecture design, training and inference strategies, and data utilization, and forming evaluation benchmarks that weigh performance against energy consumption. More AI processors with higher computing power and lower energy consumption are also expected to emerge. Leading AI companies will consolidate around large-scale models to improve downstream performance while reducing overall energy costs; policies will encourage the construction of green, low-carbon data centers; and AI itself will be applied to improve the energy efficiency of infrastructure.
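One concrete way such benchmarks weigh performance against energy use is by counting the arithmetic a model actually performs. The toy Python sketch below, in which the 4x4 weight matrix is invented for illustration, compares multiply-accumulate counts for a dense layer against the same layer with pruned, sparse weights:

```python
def dense_macs(weights):
    """Multiply-accumulate count for a dense matrix-vector product."""
    return sum(len(row) for row in weights)

def sparse_macs(weights, threshold=1e-8):
    """MAC count when near-zero weights are pruned and skipped."""
    return sum(1 for row in weights for w in row if abs(w) > threshold)

# A toy 4x4 layer where 75% of the weights have been pruned to zero.
W = [[0.9, 0.0, 0.0, 0.0],
     [0.0, 0.0, 0.4, 0.0],
     [0.0, 0.3, 0.0, 0.0],
     [0.0, 0.0, 0.0, 0.7]]
print(dense_macs(W), sparse_macs(W))  # 16 4
```

Counting operations this way is only a proxy for energy, since memory traffic and hardware utilization also matter, but it illustrates why sparse modeling features in green-AI evaluation.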
Inclusive AI
Open-source platforms centered on deep learning frameworks have greatly lowered the development threshold for AI. Public datasets, large-model bases and regional intelligent computing centers will be further developed to help small and medium-sized enterprises cut costs, increase efficiency and stimulate innovation. A national AI training system will also be gradually built to promote the reemployment of workers from traditional industries through AI education.
The development of AI should benefit all groups in society. As AI service providers pay more attention to the needs of underserved groups such as the elderly and children, they will develop inclusive AI services and products that allow everyone to enjoy the digital world.