
Top 10 technology trends for 2023

Updated: Jan 16, 2023




Need to produce content for your blog or website? Various AI apps on the market can create interesting articles after you input certain keywords and choose a tone, such as casual, informative, or knowledgeable.


According to the Alibaba DAMO Academy, this generative AI technology will become an inclusive technology that significantly enhances the variety, creativity, and efficiency of content creation this year.


It tops the list of technology trends expected to shape many industries in 2023.


“The wide application of technologies will facilitate the rollout of AI and other digital technologies in vertical markets and promote the collaboration of public and private sectors and individuals in security technology and security management,” Jeff Zhang, head of DAMO, said in a statement sent to Ventures Cebu.


DAMO analyzed public papers and patent filings over the past three years and interviewed almost 100 scientists, entrepreneurs, and engineers worldwide to develop its list of top technology trends for 2023. Read on to learn more about them.


1. Generative AI



Generative AI generates new content based on a given set of text, images, or audio files. Currently, it is mainly used to produce prototypes and drafts, and is applied in scenarios such as gaming, advertising, and graphic design. With future technological advancement and cost reduction, generative AI will become an inclusive technology that can significantly enhance the variety, creativity, and efficiency of content creation.
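As a minimal illustration of the workflow described above, the sketch below assembles a generation prompt from user keywords and a chosen tone, much as a content-creation app might do before calling a text model. The function name, tone list, and wording are illustrative assumptions, not part of any specific product.

```python
# Sketch: building a generation prompt from keywords and a tone.
# All names here (build_prompt, TONES) are illustrative assumptions.

TONES = {"casual", "informative", "knowledgeable"}

def build_prompt(keywords, tone="informative"):
    """Combine user keywords and a tone into a single model prompt."""
    if tone not in TONES:
        raise ValueError(f"unsupported tone: {tone}")
    topic = ", ".join(keywords)
    return (f"Write a short {tone} blog article covering: {topic}. "
            "Use clear headings and a concluding summary.")

prompt = build_prompt(["cloud security", "zero trust"], tone="casual")
```

The prompt string would then be sent to a generative model; the interesting engineering lies in how the app constrains tone and structure before generation.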


In the next three years, Zhang said business models will emerge and ecosystems will mature as Generative AI becomes widely marketized. Generative AI models will be more interactive, secure, and intelligent, assisting human beings in completing various creative work.


2. Dual-engine Decision Intelligence



Traditionally, decision-making has been based on operations research (OR). Because of OR's limitations in handling problems with great uncertainty and its slow response to large-scale problems, academia and industry began to incorporate machine learning into decision optimization. The two engines complement each other well and, used in tandem, can improve both the speed and quality of decision-making.
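A toy sketch of this two-engine pattern: a learned predictor (here, a trivial moving average standing in for a machine-learning model) forecasts demand, and an OR-style allocator then assigns limited capacity. The site names, numbers, and greedy allocation rule are all made up for illustration.

```python
# Toy "dual-engine" decision-making: an ML-style predictor feeds an
# OR-style optimizer that allocates scarce capacity.

def predict_demand(history):
    """ML engine stand-in: forecast next-period demand per site."""
    return {site: sum(vals) / len(vals) for site, vals in history.items()}

def allocate(capacity, demand):
    """OR engine stand-in: greedy allocation by descending demand."""
    plan = {}
    for site in sorted(demand, key=demand.get, reverse=True):
        grant = min(capacity, demand[site])
        plan[site] = grant
        capacity -= grant
    return plan

history = {"port_a": [30, 50, 40], "port_b": [10, 20, 30]}
plan = allocate(capacity=50, demand=predict_demand(history))
# port_a (forecast 40) is served first; port_b gets the remaining 10.
```

Real systems replace the moving average with a trained model and the greedy loop with a mathematical-programming solver, but the division of labor is the same: prediction handles uncertainty, optimization handles allocation.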


This technology is expected to be widely used in various scenarios to support dynamic, comprehensive, and real-time resource allocation, such as real-time electricity dispatching, optimization of port throughput, assignment of airport stands, and improvement of manufacturing processes.


Dual-engine decision intelligence will also increase the number of entities, expand the scale in regional resource allocation scenarios, and eventually achieve dynamic, comprehensive, and real-time resource allocation.


3. Cloud-native Security



Cloud-native security not only delivers security capabilities native to cloud infrastructure but also improves security services by leveraging cloud-native technologies. Security technologies and cloud computing are becoming more integrated than ever before.


"We have witnessed applied technologies evolve from containerized deployment to microservices and then to the serverless model, and security services embraced the shift to become native, fine-grained, platform-oriented, and intelligent," Zhang said.


In the next three to five years, cloud-native security will become more versatile and adapt more easily to multi-cloud architectures. It will also become more conducive to building security systems that are dynamic, end-to-end, precise, and applicable to hybrid environments.


4. Pre-trained Multimodal Foundation Models



Pre-trained multimodal foundation models have become a new paradigm and infrastructure for building artificial intelligence (AI) systems. These models can acquire knowledge from different modalities and present the knowledge based on a unified representation learning framework. In the future, foundation models are set to serve as the basic infrastructure of AI systems across tasks of images, text, and audio, empowering AI systems with cognitive intelligence capabilities to reason, answer questions, summarize, and create.
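The "unified representation" idea above can be sketched in miniature: modality-specific encoders project different inputs into the same fixed-length vector space, where they can be compared directly. The hash-based encoders below are crude stand-ins for real neural networks, and all names are illustrative assumptions.

```python
# Toy unified representation space: text and image inputs are mapped
# by separate encoders into one shared DIM-length vector space.

DIM = 8

def _embed(tokens):
    """Project arbitrary tokens into a shared, unit-length vector."""
    vec = [0.0] * DIM
    for tok in tokens:
        vec[hash(tok) % DIM] += 1.0
    norm = sum(v * v for v in vec) ** 0.5 or 1.0
    return [v / norm for v in vec]

def encode_text(text):
    return _embed(text.lower().split())

def encode_image_labels(labels):
    # Stand-in for an image encoder: embed detected object labels.
    return _embed(label.lower() for label in labels)

def similarity(a, b):
    """Cosine similarity of two unit vectors."""
    return sum(x * y for x, y in zip(a, b))

score = similarity(encode_text("a cat on a mat"),
                   encode_image_labels(["cat", "mat"]))
```

In a real foundation model the encoders are trained jointly so that matching text and images land close together; the toy version only shows why a shared vector space makes cross-modal reasoning, retrieval, and question answering possible.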


5. Hardware-Software Integrated Cloud Computing Architecture



Cloud computing is evolving toward a new architecture centered around Cloud Infrastructure Processor (CIPU). This software-defined, hardware-accelerated architecture helps accelerate cloud applications while maintaining high elasticity and agility for cloud application development. CIPU will become the de facto standard of next-generation cloud computing and bring new development opportunities for core software R&D and dedicated chip design, Zhang said.

6. Predictable Fabric based on Edge-Cloud Synergy


Predictable fabric is a host-network co-designed networking system, driven by advances in cloud computing, that aims to offer high-performance network services. It is also an inevitable trend as today's computing and networking capabilities gradually converge. Through full-stack innovation across cloud-defined protocols, software, chips, hardware, architecture, and platforms, predictable fabric is expected to supplant the traditional TCP-based network architecture and become part of the core network in next-generation data centers. Advances in this area will also drive the adoption of predictable fabric from data center networks to wide-area cloud backbone networks.


7. Computational Imaging



Computational imaging is an emerging interdisciplinary technology. In contrast with traditional imaging techniques, computational imaging uses mathematical models and signal processing capabilities and thus can perform an unprecedented in-depth analysis of light field information.
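The core idea, modeling image formation mathematically and then inverting the model computationally, can be shown with a deliberately tiny example: a 1-D two-tap blur as the forward model, inverted sample by sample. Real computational imaging systems use far richer models of the light field; this sketch is only illustrative.

```python
# Minimal computational-imaging sketch: a known forward model (blur)
# is inverted in software to recover the underlying signal.

def blur(signal):
    """Forward model: each sample averages itself and its predecessor."""
    prev, out = 0.0, []
    for x in signal:
        out.append(0.5 * (x + prev))
        prev = x
    return out

def deblur(measured):
    """Invert the forward model one sample at a time."""
    prev, out = 0.0, []
    for b in measured:
        x = 2.0 * b - prev
        out.append(x)
        prev = x
    return out

scene = [1.0, 4.0, 2.0, 8.0]
recovered = deblur(blur(scene))   # reconstructs the original scene
```

Techniques such as lensless or non-line-of-sight imaging follow the same pattern at much larger scale: capture measurements that look nothing like an image, then solve the inverse problem to reconstruct one.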


This technology is already used on a large scale in mobile phone photography, health care, and autonomous driving. In the future, computational imaging will continue to revolutionize traditional imaging technologies and give rise to innovative and imaginative applications such as lensless imaging and non-line-of-sight (NLOS) imaging.


8. Chiplet



Chiplet-based design allows manufacturers to break down a system on a chip (SoC) into multiple chiplets, manufacture the chiplets separately using different processes, and finally integrate them into an SoC through interconnects and packaging. Chiplet interconnects are being unified into a single standard, accelerating the industrialization of chiplets. Powered by advanced packaging technologies, chiplets may bring a new wave of change to the R&D process of integrated circuits and reshape the landscape of the chip industry.


9. Processing in Memory (PIM)



Processing in Memory (PIM) technology integrates compute units and memory on a single chip, allowing data to be processed directly where it is stored. In the future, compute-in-memory chips will be used in more demanding applications such as cloud-based inference. This will shift the traditional computing-centric architecture toward a data-centric architecture, positively impacting industries such as cloud computing, AI, and the Internet of Things (IoT).
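A back-of-the-envelope model shows why the data-centric shift matters: in a compute-centric design every operand crosses the memory bus to the processor, while in a PIM-style design only the result leaves the memory. The element size and array length below are illustrative assumptions, not measured figures.

```python
# Toy model of data movement: compute-centric vs PIM-style reduction.

ELEMENT_BYTES = 4          # assume 32-bit values
N = 1_000_000              # elements to reduce (e.g., a large sum)

# Compute-centric: all N operands travel over the memory bus.
bus_bytes_cpu = N * ELEMENT_BYTES

# PIM-style: the reduction happens in memory; only the scalar moves.
bus_bytes_pim = ELEMENT_BYTES

saving = bus_bytes_cpu / bus_bytes_pim   # factor of N less bus traffic
```

For reduction-style workloads the bus traffic drops by roughly the size of the data itself, which is why inference and other memory-bound tasks are the first targets for compute-in-memory chips.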


10. Large-scale Urban Digital Twins



The concept of urban digital twins has become a new approach to refined city governance. So far, large-scale urban digital twins have made major progress in traffic governance, natural disaster prevention and management, and carbon peaking and carbon neutrality. In the future, large-scale urban digital twins will become more autonomous and multi-dimensional. (Images from Alibaba DAMO Academy)


