AI PCs: is it all hype, should you switch, and if so, when?

The emerging landscape of AI PCs is set to transform how we work with our devices, whether PC or laptop. There are both positives and negatives around this new technology, and before embarking on an adoption program it’s important to know the facts.

What is an AI PC?

AI PCs feature a dedicated Neural Processing Unit (NPU) within the System on Chip (SoC) that handles AI applications and experiences on the device itself, whether that means running language models, accelerating focused tasks or supporting security and privacy features. One of its greatest benefits is low latency, and it enables greater personalization, meeting the growing need for more autonomy.

The AI PC is a new variant of edge computing, where computation is done near the data source or the end user instead of depending solely on the cloud. That mixed approach combines the strength of the cloud for intensive tasks with the speed and privacy advantages of local processing. AI PCs demonstrate this by using local hardware such as GPUs and NPUs for AI tasks, lowering latency, saving bandwidth and improving data security by reducing the amount of sensitive data that is sent to the cloud. The overall effect is a better user experience, supporting a variety of real-time analytics and AI development.
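To make that hybrid pattern concrete, here is a minimal Python sketch of the routing decision. Everything in it is an illustrative assumption rather than a real product API: the model path, the endpoint and the two stub functions stand in for an actual on-device runtime and an actual cloud service.

import os

LOCAL_MODEL_PATH = "models/assistant.onnx"             # assumed on-device model file
CLOUD_ENDPOINT = "https://example.com/api/inference"    # placeholder cloud service


def run_local_inference(model_path: str, prompt: str) -> str:
    # Stand-in for on-device execution on the NPU/GPU via a local runtime.
    return f"[local:{model_path}] answer to: {prompt}"


def run_cloud_inference(endpoint: str, prompt: str) -> str:
    # Stand-in for an HTTPS call to a hosted model.
    return f"[cloud:{endpoint}] answer to: {prompt}"


def answer(prompt: str) -> str:
    """Prefer local processing for latency and privacy; fall back to the cloud."""
    if os.path.exists(LOCAL_MODEL_PATH):
        return run_local_inference(LOCAL_MODEL_PATH, prompt)
    return run_cloud_inference(CLOUD_ENDPOINT, prompt)


print(answer("Summarize this document"))

The point is simply that the same request can be answered on the device when a local model and accelerator are available, and handed to the cloud otherwise.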

Analyst assessment

Given that 2024 has been marked by industry commentators as the year of the AI PC, it is interesting to look at the landscape through the lens of analysts. Gartner, for example, predicts that 54.4 million AI PCs will be shipped this year, IDC puts the figure at 50 million, and Canalys, using a slightly different measure, believes 1 in 5 PC shipments will be AI PCs. Looking ahead to 2025, Gartner estimates that 43% of all PC shipments will be AI PCs, while both IDC and Canalys forecast that by 2027 the figure will have risen to 60%. This spells a definite market shift in the direction of AI PCs.

AI PC chipset progress

AI PC evolution has depended on bringing the hardware and processor together to support AI applications at the PC level; an early example of this system-on-chip approach was the iPhone with its A11 Bionic processor. Now, with the introduction of chiplet designs such as the Intel Core Ultra processor, we have seen a new style of CPU built to suit varied purposes. Instead of the traditional monolithic CPU, we now have a tile-based design in which one tile is allocated to the GPU, a compute tile hosts the processor cores, and an SoC tile, which includes the NPU, supports the AI engine. Chip manufacturers are now developing and releasing their solutions, allowing AI PCs to become a realistic prospect for users.

Importance of combining CPU, GPU and NPU

Modern computing tasks require many different computational capabilities that are best met by the combination of CPU, GPU and NPU. The CPU is the central processing unit, a general-purpose processor designed for sequential processing, which runs the operating system and the conventional apps we all like to use on our laptops. The GPU is the graphics processing unit, originally created for graphics rendering but equally effective at parallel computation, making it ideal for the matrix and vector operations that are essential to AI and deep learning. The NPU is the neural processing unit, a specialized processor developed specifically for AI tasks; it efficiently accelerates neural network computations while maintaining low power consumption.

This triumvirate enables flexible computing, where each type of processor can be used for specific tasks, leading to significant enhancements in performance and energy efficiency. And these are not just being designed for PCs and laptops. CPUs, GPUs, NPUs and systems on chip that combine all three components enable an ever-increasing number of devices, including smartphones and embedded systems in sectors such as manufacturing, to realize the potential of AI.
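One way to see this division of labour in practice is through an inference runtime that can target different processors. The sketch below assumes the onnxruntime Python package is installed; which execution providers actually appear depends on your specific build (NPU providers such as QNN or OpenVINO ship in vendor-specific packages), and "model.onnx" is a placeholder for a real model file.

import onnxruntime as ort

# Preference order: NPU first, then GPU, then the general-purpose CPU.
PREFERRED = [
    "QNNExecutionProvider",       # Qualcomm NPU (vendor-specific build)
    "OpenVINOExecutionProvider",  # Intel NPU/GPU via OpenVINO (separate package)
    "DmlExecutionProvider",       # GPU acceleration on Windows via DirectML
    "CPUExecutionProvider",       # always available as a fallback
]

available = ort.get_available_providers()
providers = [p for p in PREFERRED if p in available] or ["CPUExecutionProvider"]
print("Available providers:", available)
print("Using:", providers[0])

# Create the session against the best accelerator the runtime reports.
session = ort.InferenceSession("model.onnx", providers=providers)

The same application code can then run on the NPU where one exists, on the GPU where it does not, and on the CPU everywhere else.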

Where do memory and storage fit in?

One of the biggest challenges to adopting AI PCs with confidence is dealing with the lack of information, and the myths, around how much memory is needed to run laptops and PCs with AI PC chiplets. As things stand, there are no minimum specifications, and it is common for systems to ship with 8, 16 or 32 GB of memory. However, as applications develop further and intelligent uses of the AI PC become more demanding, we would anticipate a shift in memory requirements.

The same applies to storage. Some systems have 256 GB of SSD storage, while others have 1 TB or 2 TB. It’s important to think beyond your needs today, or even next year and beyond, and anticipate what future applications might demand and what your storage and memory requirements will be.
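As a rough way of reasoning about those requirements, the short calculation below estimates the memory footprint of a locally run language model from its parameter count and numeric precision. The 7-billion-parameter example and the 20% overhead factor are illustrative assumptions, not vendor figures.

# Back-of-envelope sizing for a locally run model (illustrative numbers only).

def model_footprint_gb(params_billion: float, bits_per_param: int,
                       overhead: float = 1.2) -> float:
    """Approximate RAM needed for the weights, plus ~20% for activations,
    KV cache and runtime buffers (a rough rule of thumb, not a spec)."""
    weight_bytes = params_billion * 1e9 * bits_per_param / 8
    return weight_bytes * overhead / 1e9


for bits in (16, 8, 4):
    print(f"7B-parameter model at {bits}-bit: ~{model_footprint_gb(7, bits):.1f} GB")

# Roughly 16.8 GB at 16-bit, 8.4 GB at 8-bit and 4.2 GB at 4-bit, which is
# why an 8 GB system feels tight while 16 to 32 GB leaves more headroom.

Model files sit on the SSD as well, so several local models of that size can quickly make a 256 GB drive feel small.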

Current use cases

Examples of where the AI PC is being used are growing by the day. In business productivity Microsoft Copilot is breaking new ground, but equally popular are collaboration solutions like Zoom, Webex and Slack. Jasper is a popular sales and marketing tool, while the Adobe suite is ideal for media and creative tasks, Audacity for audio and GIMP for creative design.

Clearly, these tools are focused on communications and creativity, and they reflect the early stages of AI integration. They are applications with high demands and an obvious starting point for the benefits of AI, where it makes an immediate difference in collaboration and content creation. For many users the initial approach involves using AI PCs not in isolation but alongside their cloud AI counterparts, which remain part of the mix. As the autonomy and security benefits of running AI PC applications on local hardware become more important, this balance will shift.

As the landscape develops, the technology will become more advanced and accessible, and applications will diversify hugely. We should view the current focus areas as a testing ground for the capabilities of AI in terms of user acceptance. There will be a learning curve as users come to accept AI, but during this period the foundations are being laid across multiple industries and use cases.

Why local is good

The greatest advantage of running AI models on AI PCs is that all the processing is local, boosting security and privacy and allowing users to move away from the risks of moving or storing sensitive data in the cloud, or sending it to public AI models. AI PCs have the potential to lower the chances of data breaches or unauthorized access, and make it easier to comply with data protection regulations such as GDPR simply by keeping data on site.

In addition, locally operated models are more resistant to network-related problems, ensuring that essential AI functionality remains accessible even if cloud services fail due to connectivity issues or cyber-attacks targeting cloud infrastructure.

Of course, local AI devices still need strong security measures to protect against local cyber threats such as malware or physical tampering. A comprehensive approach is required, covering secure model training, data encryption, proper access control and continuous monitoring for potential threats.

Preparing for change

Before making the decision to migrate to AI PCs, first consider what your organization needs now, what is available today to meet that need, the applications that are needed to suit specific job functions and where you are in the refresh cycle. If, for example, you are prepared to be an early adopter with the full knowledge that applications for AI PC are currently limited, but suit your needs, you are well placed to transition. If, however, you are unlikely to refresh for another 3-4 years, it might be worth waiting until the technology and applications have evolved further.

Keeping close tabs on AI PC chiplets from key manufacturers such as AMD and Intel, and understanding how memory and storage are evolving to keep pace – DDR4 versus DDR5, for example – will help you identify the right moment to adopt the AI PC in terms of applications, performance and costs.

Another important factor is internal preparation. Staff must be trained to fully optimize AI PC systems and to operate them within a cyber-secure environment. AI technology is changing quickly, and adoption requires a comprehensive strategy. One of the greatest challenges right now is the lack of skilled professionals who understand the implications of AI from all perspectives. Instead of scrambling to manage AI regulatory compliance once AI PCs are adopted, the best approach is to be on top of the policies and practices that will be required in advance, and to understand the resources that are needed internally.

A final word

As always with a new technology, there are subtle trade-offs involved in the opportunities and risks of adoption. Early adopters who can benefit from the AI PC applications currently available could gain first-mover advantages over their competition. Other organizations will want to ensure they have the right systems and policies in place to support the adoption of the AI PC. It is also worth considering a more nuanced approach and “buying” yourself time by upgrading key components as this technology rapidly evolves, rather than committing fully today and changing everything in one fell swoop.

But if you are buying an AI PC today, ensuring that you can upgrade your storage or memory in the future will leave your hardware better equipped to run AI PC applications that can operate in tandem with existing AI applications in the cloud.


This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro


