Brandon Wang is a vice president at Synopsys.
The environment surrounding artificial intelligence is steadily undergoing changes as sweeping as the industrial revolution and the Internet boom. While discriminative AI, including machine learning and deep learning, is reshaping traditional industries, GenAI is beginning to push the boundaries of human creativity. Combine the rapid advances in AI hardware with the potential of emerging technologies such as quantum AI, and it’s clear that we’re only at the beginning of an ongoing technological revolution.
AI is relevant to every industry, including chip design. Today, AI-driven electronic design automation (EDA) solutions are delivering more than 10% improvements in performance, power, and area (PPA), up to 10x faster turnaround times, and double-digit gains in verification coverage. In the healthcare industry, AI algorithms can analyze medical images faster and more accurately than human experts, and AI-based drug discovery can significantly reduce the time to market for new treatments.
The Rise of AI Agents: A New Frontier
One of the most exciting technological developments in the field of AI is the emergence of AI agents. They function much like automated assembly lines, breaking large tasks down into smaller ones and using AI to perform each one more efficiently. AI agents can be categorized into five levels along a spectrum of autonomy, from simple reactive L1 agents to fully autonomous L5 agents.
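As a rough sketch (the level descriptions below are illustrative assumptions, not an industry-standard taxonomy), the five-level spectrum can be expressed as a simple enumeration:

```python
from enum import IntEnum

class AgentAutonomy(IntEnum):
    """Illustrative sketch of the five-level agent autonomy spectrum."""
    L1_REACTIVE = 1    # responds to individual prompts; no planning or memory
    L2_ASSISTED = 2    # chains a few steps with human review at each stage
    L3_SUPERVISED = 3  # plans multi-step tasks; humans approve key actions
    L4_STRATEGIC = 4   # makes strategic decisions within human-set goals
    L5_AUTONOMOUS = 5  # fully autonomous and adaptive, minimal oversight

def needs_step_approval(level: AgentAutonomy) -> bool:
    # In this sketch, only L4 and above act without step-by-step sign-off.
    return level <= AgentAutonomy.L3_SUPERVISED
```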
While current agent applications primarily operate at L2 or L3, L4 agents capable of strategic decision-making and adaptive L5 agents could revolutionize fields from robotics to healthcare. L4-level agents are expected to be widely deployed in specialties such as autonomous robotic surgery by 2035. Applications of L5 AI agents, including fully autonomous surgery and personalized medicine, may not become widespread until after 2050.
Among the benefits these agents promise are increased productivity and the potential to compensate for staffing shortages. However, AI agents also come with significant challenges. The main hurdle is error compounding, a serious concern for AI systems performing complex tasks: the higher the level of agent autonomy, the greater the per-step accuracy required, because errors accumulate across multi-step processes. Other challenges include contextual memory limitations, where reliance on vector databases can contribute to hallucinations, and task planning difficulties caused by heavy reliance on prompts, which limits scalability.
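To see why error compounding matters, consider a back-of-the-envelope calculation (the per-step accuracies below are illustrative assumptions): if each step of a task succeeds independently with the same probability, end-to-end reliability decays exponentially with the number of steps.

```python
# Illustrative sketch of error compounding in multi-step agent workflows.
# Per-step accuracies are assumed values, not measurements.

def task_success_rate(per_step_accuracy: float, num_steps: int) -> float:
    """Probability that all steps complete correctly,
    assuming independent, equally reliable steps."""
    return per_step_accuracy ** num_steps

for accuracy in (0.99, 0.95, 0.90):
    rate = task_success_rate(accuracy, 20)
    print(f"per-step accuracy {accuracy:.0%}: 20-step task succeeds {rate:.1%} of the time")

# per-step accuracy 99%: 20-step task succeeds 81.8% of the time
# per-step accuracy 95%: 20-step task succeeds 35.8% of the time
# per-step accuracy 90%: 20-step task succeeds 12.2% of the time
```

Even a 95%-accurate step, repeated 20 times, leaves an agent that completes the whole task barely a third of the time, which is why higher autonomy levels demand much higher per-step accuracy.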
Transforming the job market
The rapid advancement of AI has raised legitimate concerns about its impact on the labor market. However, history shows that technological disruption usually increases GDP by creating new demand and new uses, and produces a net gain in jobs.
A good example is the semiconductor industry, which has limited resources in terms of both capital investment and talent. Integrated circuit (IC) designers in particular represent a relatively small talent pool, with an estimated workforce of tens of thousands worldwide, compared with roughly 28.7 million software developers and about 73 million IT professionals worldwide as of 2024. Demand for IC design work consistently outstrips the available talent, so improving productivity through AI can help close this gap. For example, if 100 IC designers are typically required to meet current demand, AI can enable 70 designers to achieve the same results, filling labor shortages or freeing up resources to tackle additional demand that would otherwise go unmet. Design automation tools with built-in AI capabilities have been shown to improve designer productivity.
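As a quick sanity check on the example above (the headcounts are illustrative figures, not industry data), the productivity multiplier AI tooling would need to deliver can be computed directly:

```python
# Minimal sketch of the labor-gap arithmetic in the example above.
# The headcounts are illustrative figures, not industry data.

designers_needed_without_ai = 100
designers_available = 70

# Productivity multiplier each designer would need with AI assistance
required_multiplier = designers_needed_without_ai / designers_available
print(f"Each designer must become ~{required_multiplier:.2f}x as productive")  # ~1.43x

# Equivalently, the share of demand that would otherwise go unmet
unmet_share = 1 - designers_available / designers_needed_without_ai
print(f"Demand unfilled without AI: {unmet_share:.0%}")  # 30%
```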
McKinsey insights show that the semiconductor industry will need to make significant talent investments to meet AI’s growing computing demands, with a focus on specialized areas such as AI accelerators and high-performance GPUs. This demand could create tens of thousands of new jobs around the world over the next decade.
As AI continues to evolve, the list of new roles, such as AI agent developers, AI ethicists, and prompt engineers, will keep growing. In the future, AI personality designers will create brand-specific AI personas, AI trainers will enrich models with high-quality data and fine-tuning, and AI system auditors will assess bias and regulatory compliance.
AI also brings new skill requirements. Beyond technical skills such as machine learning and data science, it calls for human-centered skills such as creative problem-solving, critical thinking, and emotional intelligence, which won’t be replaced by AI anytime soon.
Is adopting AI more of a defensive strategy?
The stakes are high for companies considering AI adoption, and the competitive risks of slow adoption make AI a more defensive strategy. Businesses need to analyze how AI will impact existing applications, the potential for new applications, and the technology barriers that will guide their build vs. buy strategies.
When implementing AI, companies must carefully consider timing as well as demand. Is it better to build in-house AI capabilities or leverage commercial platforms? Can current resources meet the demand, or will opportunities be missed?
Three waves of AI evolution
Major technology disruptions are rare, occurring every 20 to 30 years. The last one ushered in the age of the Internet, which evolved through three distinct waves. The first was the creation of infrastructure, which established the basic foundation for the Internet’s development; companies like Cisco and JDSU helped build the network backbone. The second wave centered on software platforms and services built on Internet technologies, from companies such as Salesforce and Adobe, driving enterprise-level growth. The third wave brought mobility and millions of applications tailored to the needs of end consumers across a wide range of sectors.
AI is the current mega-technology disruptor, and it appears to be following a similar path through infrastructure, enterprise, and application stages. So where are we now? We are in the midst of an explosion of LLMs trained in high-performance computing (HPC) data centers, and at the beginning of the infrastructure wave: demand for computing chips, whether GPUs or custom ASICs, is rising rapidly. If so, will the second and third waves of AI, enterprise integration and edge applications, arrive sooner than they did in the Internet era? Given the speed at which AI is advancing, these stages could unfold even faster, transforming industries more quickly than the Internet did.