
What is a GPU? Everything you need to know

A brown-black GPU floating on a black background

The GPU is one of the most important components in any PC: it's not just needed for high-end graphics performance, but for generating an image at all, whether that's your Windows desktop or a 3D world in a video game.

In fact, a GPU isn’t just important for your computer, as even your smartphone, tablet and smartwatch will have some form of GPU.

Whether you’re looking to upgrade your PC, or simply want to learn a little more about your device, this is our comprehensive guide on the GPU, covering why it’s so important and what factors you need to consider before a purchase.

What is a GPU?

A GPU, or Graphics Processing Unit, is the computer component that creates images. Any device that puts images on a display (such as a laptop, tablet or smartphone) will have a GPU.

Without a GPU, a device would be unable to create any on-screen images, rendering it as useless as a doorstop.

What does a GPU do?

GPUs are responsible for creating images, scenes and animations on a device, and a more powerful GPU will accelerate the rendering of 3D graphics, making it useful for many demanding workloads.

While every smartphone, tablet and laptop will feature a GPU of some form, they're mostly associated with gaming. That's because you generally need a powerful GPU to run games smoothly, as the component is tasked with rendering frame after frame in a matter of milliseconds.
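
To put "a matter of milliseconds" into numbers, here's a rough back-of-the-envelope sketch in Python (purely illustrative) of the time budget a GPU has to render each frame at common frame rates:

    # Illustrative only: the per-frame time budget a GPU must hit
    # to sustain a given frame rate.
    for fps in (30, 60, 120, 144):
        budget_ms = 1000 / fps  # milliseconds available to render one frame
        print(f"{fps}fps leaves {budget_ms:.1f}ms per frame")

At 60fps, an entire 3D scene has to be drawn and displayed in under 17 milliseconds, which is why raw GPU horsepower matters so much for gaming.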

GPUs are required to render multiple complex animations at the same time, which comes in handy not only for gaming but also for creatives who are looking to render 3D models or edit video. For example, the MacBook Pro has a more powerful GPU than most laptops, since it's designed specifically for digital artists and designers.

GPUs are also used in data centres, where they power the likes of cloud gaming, video encoding and video streaming, as well as bitcoin mining.

Essentially, a GPU has a massive influence on everything you see on your screen, and the quality of a GPU is one of the biggest differentiators between the MacBook Pro and MacBook Air.

What is an iGPU?

There are two main types of GPU. The first is the iGPU (or Integrated Graphics Processing Unit), which is built into the main processor rather than sitting on a separate chip from the CPU.

These types of GPUs are generally less powerful than discrete GPUs, and are usually developed for basic rendering tasks and generic productivity work, rather than gaming and 3D animation.

The GPU inside the Apple M4 chip
Integrated GPU – Image Credit (Trusted Reviews)

That being said, companies such as AMD and Intel are working on improving the performance of their iGPUs. Apple has also surprised many people with its Apple Silicon chips, particularly the Max and Ultra versions of its M range, which flaunt iGPUs so powerful that they can compete with high-end graphics cards.

It’s also worth noting that anyone looking to buy a new PC without a dedicated graphics card should check that it features an integrated GPU, as sometimes this isn’t the case. You’ll generally need either an iGPU or dGPU for your PC to run, as otherwise it will be incapable of generating any images.

What is a dGPU?

The second variation is the dGPU, also known as a Discrete Graphics Processing Unit. A dGPU is usually found inside a graphics card for a desktop system, though it can be a dedicated chipset in high-end laptops too.

A discrete GPU is usually a lot more powerful than an iGPU, as it specialises in rendering advanced graphics for the likes of content creation or gaming.

If you’re looking to build a gaming PC, a powerful dGPU is essential for rendering high-quality images and scenes. Right now, the biggest names in the dGPU industry are AMD and Nvidia, although Intel has also launched its own GPUs in the form of Intel Arc.7

Nvidia RTX 4090 discrete GPU
Discrete GPU – Image Credit (Trusted Reviews)

Discrete GPUs do come with a catch, however: they require a dedicated cooling system to prevent overheating and to maximise their performance, which is a big part of why gaming laptops are so much heavier than traditional laptops.

dGPUs also have a higher power consumption, which means that gaming laptops will usually need charging more often than devices that come with an iGPU.

Plus, discrete GPUs can boost the price of a laptop significantly, which is why they’re only usually recommended if you’re going to be gaming or creating content. It’s arguably the most expensive component of a PC.

Is the GPU the same as a graphics card?

While the terms GPU and graphics card are often used interchangeably, they are not the same thing.

A GPU is the chip that is doing all of the heavy lifting, and can be found inside a graphics card.

The graphics card is an ‘expansion card’ that enables the GPU to connect to the motherboard. Graphics cards will also have built-in fans to help cool down the GPU when under stress, as well as their own dedicated RAM.

In essence, the graphics card is the house that a GPU lives in, helping it to connect to the rest of the computer, in a similar relationship to a CPU and a motherboard – the former providing all of the processing power, while the latter sorts out the logistics.

Due to the size of a graphics card, they’re typically only found in desktop PCs. A laptop will instead feature a GPU chip that plugs directly into the motherboard, with each laptop manufacturer developing their own cooling system instead.

Is a GPU better than a CPU?

We wouldn’t say that the GPU is better than the CPU or vice versa, as they both perform different jobs. While a GPU is tasked with generating images for a device, the CPU is instead responsible for processing and sending out instructions to all of the other components in an electronic device.

Without a CPU, a GPU wouldn’t be able to perform its job, making both components incredibly important. Both will often be baked into the same chip, especially for small, portable devices such as a smartphone. But even when a CPU and GPU occupy separate chips, they’re still able to communicate with each other effectively.

Depending on your desired workload, you may want to target different performance levels for each chip. A CPU's performance will affect almost every workload, although you only really need high-end performance for complex workloads or when juggling multiple tasks simultaneously. Meanwhile, a powerful GPU will generally have a greater impact on gaming performance than the CPU, and it can also have a major influence on the likes of content creation.


Asus ROG Ally vs ROG Ally X: What's the difference?

Asus has announced a new model of its handheld gaming PC, the ROG Ally X. 

Rather than acting as a direct sequel to the original ROG Ally, the new X model will instead be a more expensive option, with Asus planning to sell both versions simultaneously. 

But what kind of upgrades will you be getting by opting for the ROG Ally X? While Asus hasn't officially unveiled all the details and key specs just yet, a report from The Verge has given us a sneak peek at the key upgrades. Check them out below:

Improved battery life 

The key selling point of the Asus ROG Ally X is undoubtedly the extended battery life. Asus says that it was one of the most requested upgrades, and so has slapped in a larger battery pack so gamers can play on the go for longer. 

We don’t know the exact new battery life just yet, but Asus has suggested that it will be “way more” than a 30% to 40% improvement. Given that the current ROG Ally struggles to play AAA games with max settings for more than 90 minutes, we’re hoping that it will at least be able to hit the 3-hour mark. 

Of course, fitting in a larger battery pack will make the ROG Ally X a heavier portable than its predecessor, but that’s a fair compromise considering the battery life is one of the most important features of a portable gaming device. 

Boosted RAM and longer M.2 slot

Asus has confirmed that it won’t be upgrading the main chip for the Asus ROG Ally X, so it will still be powered by the AMD Z1 Extreme found inside the original ROG Ally. However, Asus said the portable should still get a performance boost, but in the form of more RAM.

Asus told The Verge that the ROG Ally X will offer more than the 16GB of RAM found in its predecessor. While not confirmed, we're speculating that Asus could offer a 32GB configuration, as anything higher could feel a tad excessive.

Asus has also said it will be expanding the M.2 SSD slot, making it easier for you to find an SSD to upgrade the storage if you wish to do so. This is great to hear, as you're currently limited to very short drives, which restricts your options and inflates the price.

Asus ROG Ally
Asus ROG Ally – Image Credit (Trusted Reviews)

New placement for SD card reader 

Many existing Asus ROG Ally owners have reported issues with the SD card reader, with the port seeing unusually slow read speeds or sometimes failing to recognise an inserted card at all. It has been suggested that the reader's close proximity to the portable's vents could be the cause.

With the ROG Ally X, Asus has moved the placement of the SD Card reader away from the vents, which should hopefully rectify the issue. 

New black colour option 

The most striking difference in the design of the Asus ROG Ally X is the new black colour option, making it visually distinct from the original. We welcome the new colour, as black is generally better at hiding scuff marks and the like.

Asus has also said its new portable will be more repairable this time around, with interchangeable joystick modules and retuned triggers, joystick and D-pad. We’re yet to see the entire design of the Asus ROG Ally X, so there may well be additional changes, but we expect it to mostly keep in line with the original. 


Ctrl+Alt+Del: The Asus ROG Ally X has solved PC handhelds' greatest flaw

OPINION: This week, Asus confirmed that it will be launching the ROG Ally X, a new PC handheld system that will coexist with the original model as a more high-end option. 

Asus is keen to point out that the ROG Ally X won't act as a sequel, as The Verge reports that the new model will still feature the same AMD Z1 Extreme chip and 7-inch 120Hz LCD screen.

It’s admittedly disappointing that Asus isn’t upgrading to an OLED display, as it’s been a question mark over the ROG Ally ever since the introduction of the Steam Deck OLED. But on the other hand, I’m pleased to see that Asus will be addressing the great flaw of the ROG Ally, and handheld PC systems in general: battery life. 

Asus ROG Ally X

I’m a big fan of the original Asus ROG Ally. It’s powerful enough to play all of my favourite PC games, and is remarkably light considering the specs it’s packing. However, all of that portable power arguably goes to waste when you consider how poor the battery is. 

When testing the handheld with modern AAA games, the ROG Ally was only capable of lasting around 90 minutes when set to Performance mode. That's poor stamina, and sadly not long enough to last lengthy trips on a plane or train.

This isn’t a problem exclusive to the ROG Ally though. I found that the Ayaneo 2S only lasted 100 minutes in the same test, while the Lenovo Legion Go was only able to hit two hours. The Steam Deck OLED is the best of the bunch in this regard, but even then you need to make compromises to performance settings to last more than three hours. 

Steam Deck OLED
Steam Deck OLED – Image Credit (Trusted Reviews)

Asus is opting to fix this problem with the upcoming ROG Ally X. With the official launch still a couple of months away, the company has been coy on exact figures, but when speaking to The Verge, Asus claimed there will be a significant improvement for battery life. 

“We’re not looking at 30 to 40 percent more capacity,” Asus SVP Shawn Yen told The Verge. “We’re looking at way more than that.” When pushed further, Asus even hinted that we can expect a battery life longer than three hours, which would double the stamina of the original. 

Since Asus isn’t changing the chip, it seems that this stamina boost will come from a larger battery pack. This will likely have significant consequences for the rest of the device, especially in terms of weight. 

Asus ROG Ally in hand
Asus ROG Ally – Image Credit (Trusted Reviews)

I don’t think that will be an issue though. The vanilla Asus ROG Ally only weighs 608g, which is even lighter than the Steam Deck. This leaves plenty of wiggle room for more heft, while still ensuring it’s portable enough to easily carry around. That’s a great compromise in my eyes, considering how important the battery life is for a gaming system designed to be used on the go. 

My biggest concern is that the ROG Ally X will come with an inflated price. The gaming handheld is already rather expensive at £599/$699, yet Asus has confirmed that the X model will have a higher price. For those who already own a ROG Ally, I'm not convinced that longer battery life, boosted RAM and a roomier M.2 SSD slot will justify an upgrade.

Nevertheless, the introduction of the Asus ROG Ally X shows that handheld PC makers are going in the right direction. Battery life should be one of the most important considerations when designing these devices, even if that results in compromises to performance and design. 

And with both Asus and Valve focusing specifically on this area with their latest revised models, I’m growing more confident that we’ll see even greater battery life improvements for future devices. If I can play my PC games for over five hours straight on the go, then my purchase will be guaranteed.


Ctrl+Alt+Del is our weekly computing-focused opinion column where we delve deeper into the world of PCs, laptops, handhelds, peripherals and more. Get it straight into your email inbox every Saturday by signing up for the newsletter.


6 features the Nintendo Switch 2 must have for me to buy it

Nintendo Switch

OPINION: Nintendo has finally confirmed that the successor to the Switch will launch before April 2025, which means it’s only a matter of months away from release. 

The Nintendo Switch has been a huge success, currently ranking as the third best-selling console of all time. This means that the Switch 2 has a lot to live up to, especially now that there's far more competition in the handheld market following the release of the Steam Deck.

With that in mind, I’ve created this list of the 6 features that the Nintendo Switch 2 must have in order to guarantee my purchase and become another success story. Check them out below:

Assassin's Creed Mirage
Enemies – Image Credit (Ubisoft)

Improved graphics performance 

I love the Nintendo Switch and still use it regularly, but there’s no denying that its Nvidia Tegra chip is starting to show its age. There have been performance issues for first-party games such as Pokemon Scarlet and Violet, while the third-party offering has dried up in recent years due to the Switch’s inability to keep up with the PS5 and Xbox Series X hardware. 

For example, Assassin’s Creed Mirage failed to materialise on the Switch, despite being playable on an iPhone 15 Pro smartphone. There’s also a growing number of third-party games on Switch that are only playable through the cloud, including Resident Evil Village, Hitman 3, Marvel’s Guardians of the Galaxy and Kingdom Hearts 3. 

I’m hoping that this cloud workaround will be a thing of the past, and the increased firepower of the Switch 2 should enable it to run more complex games. It should also open the door for first-party developers to be even more ambitious. 

While a handheld system is always going to be more restricted than a home console, I believe that making use of AI features such as DLSS should allow it to push performance up to a 1080p resolution while still maintaining healthy battery life.

Nintendo Switch in the dock

4K output to TV in docked mode

I’ve heard some Switch fans say they’re hoping for a 4K upgrade for handheld mode, but I’d disagree on that point. You don’t really need that high of a resolution for a small 7-inch screen to make a noticeable difference, and upping the pixel count this high would have a detrimental effect on battery life. However, there’s no reason why Nintendo couldn’t enable a 4K performance when the Switch 2 is connected to the TV. 

Right now, the Switch’s dock is essentially just a plastic docking station that makes it easy to hook up the portable to a TV. I think Nintendo should upgrade the dock to feature a powerful chip which could upscale the performance of the Switch when connected to the TV. This would theoretically enable the Switch to reach a 4K resolution on a TV without compromising on the battery life in handheld mode. 

Now that 4K TVs are steadily becoming the norm, especially following the release of the PS5 and Xbox Series X, it’s a good time for Nintendo to finally make the jump. I personally think the 1080p output to a TV can look a little pixelated at times, especially since I own a 55-inch television. Enabling 4K mode on the Switch’s dock would immediately solve this issue, and make the succeeding console feel like a substantial upgrade. 

joy-con

Hall-effect joysticks 

If you asked every Switch fan what their most hated feature of the portable gaming system is, Joy-Con Drift would likely rank very high. This is the term given to a Joy-Con controller fault that triggers unwanted analogue stick inputs, potentially causing your in-game character to spin around in a circle or move in the wrong direction.

Joy-Con Drift can be caused by excessive use of the analogue stick, to the extent that the mechanical parts begin to wear away. Nintendo sadly hasn't been able to fix this issue, even with the release of the Switch OLED, but it should hopefully be rectified with the Switch 2.

The emergence of Hall effect joysticks has been revolutionary for controllers, as they use magnetic sensors rather than physical contact to register your inputs, greatly reducing the risk of wear and tear. If Nintendo were to implement the technology in its next-gen Joy-Cons, it would eliminate Joy-Con Drift for good.

super mario odyssey

Backwards compatibility

The Nintendo Switch has now been around for over seven years, and in that time, it has amassed a huge collection of games. It would be a huge shame if it wasn’t possible to play these games on the Switch 2, and so I have my fingers crossed that there will be support for backwards compatibility.

Having backwards compatibility on the Switch 2 would also ease the pressure on the new device to deliver a compelling line-up of launch games in its first year, since you'd still have access to your existing Switch library.

I’m also hopeful that the Switch 2 could offer a performance or visual boost to older games. For example, if the Switch 2 supported 4K resolution in docked mode, I’d love to revisit games such as Tears of the Kingdom and Super Mario Odyssey in the enhanced resolution. 

I’m also hopeful that the Nintendo Switch Online service continues to offer all of the NES, SNES, N64 and Game Boy classics, as I’ve loved being able to dip into these retro games on the portable. 

Ratchet and Clank Rift Apart

Speedier loading times

One of my favourite PS5 upgrades is the transition to ultra-fast NVMe SSD storage. This cutting-edge storage solution offers significantly speedier loading times, which has all but eradicated loading screens in modern games.

I’d love to see this upgrade arrive on the Nintendo Switch 2, as loading screens can still be pretty lengthy, especially for open-world games such as Tears of the Kingdom. 

The PS5’s new SSD has also allowed for improved gameplay features, such as near-instant fast travel, and the ability to hop between game worlds in an instant, demonstrated best by Ratchet and Clank: Rift Apart.

Sony has arguably underutilised the breakneck speed of the PS5’s SSD, but I can’t imagine Nintendo doing so, as it loves to eke every drop of potential out of every new innovation, whether it’s the motion control of the Wii or the 3D effects of the 3DS. 

Nintendo eShop on Switch

Customisable UI

I’m a big fan of the user interface of the Nintendo Switch, as it’s simplistic and easy to navigate, especially since all of the games are organised in a row, organised by the most recently played. 

That said, I do wish that the Switch offered a tad more customisation. For example, I loved being able to set my own background on the PS4; that kind of personalisation would help differentiate your Switch from one owned by a sibling or friend.

It’s an obvious win for Nintendo in my eyes, as it could not only sell fun wallpapers on its digital store, but also provide special edition variations through Amiibo as an extra incentive. 


The Apple M4 feels wasted without new iPad Pro software

OPINION: This week, Apple shocked the world by unveiling yet another generation of Apple Silicon chips, this time in the form of the Apple M4 chip. 

The fact that Apple is launching a new chip isn't surprising in itself; it's the timing. After all, the Apple M4 was announced just seven months after the release of its predecessor, the M3 chip.

Many have suggested that this short gap between generations is likely due to Apple’s eagerness to capitalise on the growing popularity of AI. All of the major technology companies, such as Microsoft, Intel, Google, Nvidia, AMD and Qualcomm, are investing heavily in AI right now, so Apple seemingly wanted to act quickly before being left behind. 

According to Apple, the M4 features the most powerful neural engine ever, capable of a stunning 38 trillion operations per second. For comparison, the preceding Apple M3 is restricted to just 18 trillion operations per second, which is less than half the AI performance of the M4 processor. 

The AI power of the Apple M4 is remarkable, but all of that power will be wasted if Apple doesn’t develop enough software to make use of the processor’s cutting-edge neural engine. AI performance differs from raw processing power, as it only impacts the performance of AI-powered workloads. Right now, there aren’t a huge number of AI features that Apple offers, at least not enough that will benefit from the added power that the M4 chip promises. 

When the M4 was first announced, Apple gave a couple of examples of AI features that could benefit. The first involves being able to isolate a subject from its background when editing a 4K video in Final Cut Pro. However, this is a feature that’s already available via the Google Pixel 8 smartphones, and is quickly becoming available in a wider range of devices. 

Apple M4 specs
Credit: Apple

Apple also says the AI power of the M4 allows iPad Pro users to automatically create musical notation by simply listening to someone play the piano. It’s an impressive feat, but it’s a feature that the vast majority of people will have no interest in. 

As of now, there isn’t really any Apple AI feature that’s enticing enough to make me consider an iPad Pro purchase, or any future Mac that will be upgraded to the new M4 chipset. 

This isn’t an exclusive issue to Apple. It’s currently the same problem for Windows devices, as the AI power of Intel’s 14th Generation processors currently feels underutilised, with a dearth of advanced AI features to whet the appetite. 

iPad Pro M4
Credit: Apple

That said, I find the issue more surprising with Apple, as the company is in the unique situation of having full control of both the hardware and software of its products. As a result, it should be easier for Apple to harness the full potential of its M4 chip via new software updates and apps. 

I expect Apple will eventually address this issue. In fact, the upcoming WWDC event on 10th June 2024 is the perfect opportunity for the company to release a slew of AI-powered features that will make the purchase of an iPad Pro far more compelling. 

Nevertheless, I still can’t help but feel the release of the M4-powered iPad Pro is a tad premature. The record-breaking AI performance may be an exciting development, but if you buy the tablet this month, you won’t really be able to benefit from its untapped potential. 


Apple M4 vs Apple M2: What's the difference?

It was only two years ago that Apple first announced the M2 processor, and yet it's already starting to feel a little outdated following the recent launch of the new Apple M4.

But what kind of performance upgrade can you expect from the M4? And are there any other benefits to opting for the newer chip? This is a particularly important question if you're deciding between the latest iPad Pro and its predecessor.

We’ve created this guide to highlight the key differences between the Apple M4 and Apple M2 processors, so you’ll know exactly what you’re getting by spending extra. 

M4 has eight billion extra transistors

One of the biggest indicators of a processor's performance is the number of transistors squeezed onto the chip. Thanks to the adoption of TSMC's second-generation 3nm process, Apple has managed to increase the number of transistors on the M4 processor to a whopping 28 billion.

For comparison, the Apple M2 chip is based on a 5nm process, and is therefore limited to 20 billion transistors. That's still plenty for fast performance, but the M4's extra eight billion transistors push performance to new heights.

Apple M4 chip
Credit: Apple

M4 offers 50% more processing power 

Apple claims that the M4 chip offers 50% more processing power than the M2 processor, resulting in a noticeably speedier overall performance. 

This extra processing power isn't purely down to the upgraded architecture, but also to an increase in the number of cores. The M4's CPU is made up of four performance cores and six efficiency cores, for a total count of 10.

In comparison, the Apple M2 has four performance cores and four efficiency cores, and is therefore restricted to a total CPU core count of eight. The extra cores in the M4 will allow it to better handle multiple workloads, which is particularly important for the likes of video editing.

Hardware-accelerated ray tracing 

The Apple M4 chip has a new 10-core GPU, which is apparently up to 4x faster than the M2's. It also flaunts exciting features such as Dynamic Caching and hardware-accelerated ray tracing. The latter enables the M4 to make use of highly advanced lighting, shadow and reflection effects in video games. It's impressive to see the technology make its way to Apple devices, considering it was one of the most exciting features of the PS5 and Xbox Series X.

The older Apple M2’s GPU also has a 10-core design, but it’s lacking cutting-edge features such as hardware-accelerated ray tracing. This means that M2-powered devices won’t be capable of the same photorealistic visuals as those with an M4 chip. 

Apple also claims that the M4 is more power efficient than the M2. Apparently, the M4 can deliver the same level of performance as the M2, but at half the power. This should go a long way to extending battery life, which has been one of the biggest selling points of the M processors. 

Apple M4 chip specs
Credit: Apple

M4 offers fantastic AI performance 

One of the most exciting elements of the M4 processor is its upgraded AI performance. The 16-core neural engine is now capable of 38 trillion operations per second, making it the most powerful NPU ever seen on a device. 

The Apple M2 also features a 16-core neural engine, but it’s only capable of 15.8 trillion operations per second, making it less than half as powerful as the M4. 

So why is AI performance so important? Apple says the new AI powers of the M4 chip allow users to isolate a person from their background in a 4K video with just a tap on the iPad Pro's screen. It's also able to automatically create musical notation in real time when listening to someone play the piano. The number of AI features and apps is only going to increase in the near future, which will make AI performance even more important going forward.


Apple M4 vs Apple M3: What's the difference?

Apple has announced the new M4 chip, the latest entry to the Apple Silicon family. 

This reveal has come as a surprise to many, with Apple only launching the preceding Apple M3 processor late last year. With so little time separating the two processors, you may be wondering how they differ. 

We’ve created this guide to highlight the key upgrades for the new Apple M4 chip compared to the existing M3 processor, so you know which processor is best for you. 

Powerful AI performance

The headline upgrade for the Apple M4 chip seems to be its advancement in AI performance. Apple has demonstrated new AI-powered features that the chip will make possible, such as isolating a subject from its background in a 4K video in an instant. 

This advancement in AI performance is thanks to the new neural engine inside the M4 chip, which Apple claims is the most powerful of its kind in the world. Apple’s neural engine has a 16-core design and is capable of 38 trillion operations per second. For comparison, the 16-core NPU of the Apple M3 is only capable of 18 trillion operations per second. 

Apple’s introduction of the M4 chip follows a big trend of doubling down on AI in recent years. AMD, Intel and Qualcomm are all in the process of rolling out AI chips, while Adobe, Google and Microsoft have also been developing AI-powered software.

If anything, it seems like Apple is actually a little late to the AI party compared to its competitors. But the M4 chip marks Apple’s biggest leap into the market yet, and it seems this is only the beginning. 

Apple M4 specs
Credit: Apple

M4 chip offers more processing power

The focus may well be on AI when it comes to the M4 chip, but that doesn’t mean Apple has neglected the raw processing power. Apple claims that the M4 offers 50% more processing power than the M2 chip. Unfortunately, Apple didn’t offer a percentage increase compared to the M3. 

Apple has upped the core count of the chip, with the M4 featuring 4 performance cores and 6 efficiency cores, totalling up to 10 cores. Meanwhile, the Apple M3 only features 8 cores, made up of 4 performance cores and 4 efficiency cores.

The performance boost is also aided by the move to a new architecture, using the cutting-edge second-generation 3nm process from TSMC.

This is still a baseline chip of course, with Apple expected to launch Pro, Max and Ultra variants of the M4 processor further down the line for its high-performance Macs.

The M4 chip is only available in the iPad Pro, for now… 

Apple usually introduces its new processors inside a Mac, with the M3 debuting inside the iMac and MacBook Pro late last year. However, the Apple M4 has bucked this trend by exclusively launching inside the iPad Pro. 

The new chip isn’t available in any Mac right now, which means the MacBook Air, MacBook Pro, iMac, Mac Mini and more are all still stuck on the previous M3 generation. 

There’s no need to worry though, as renowned Apple leaker Mark Gurman says that Apple is still planning to launch a slew of M4–powered Macs before the end of the year, most likely in October or November. 

But for now, if you want access to the new M4 chip and all of its AI powers, then your only option is the iPad Pro.


High-end vs mid-range CPUs: Is the difference worth the cost?

Buying a new CPU for your PC can be an intimidating process, not only because of all the technical specs you need to consider, but also because of the high cost.

One of the most common questions that people ask is whether they should purchase a mid-range or high-end processor. The likes of AMD and Intel will always launch multiple chips for each generation, all with different price points. So which one should you go for?

We’ll be explaining everything you need to know about purchasing a CPU, so you can be more confident that you’re purchasing the right processor for your desired workloads. 

What is a high-end CPU?

Let’s start with defining what a high-end processor actually is. In terms of Intel chips, the i7 and i9 ranges are generally thought of as the high-end options. Over at AMD, it’s the Ryzen 7 and Ryzen 9 chips. But what exactly is it that makes these chips high-end? 

Intel Core i9-14900K in motherboard
Image Credit (Trusted Reviews)

Generally, high-end chips have more cores and higher peak clock speeds, making them the most powerful options. For example, the high-end Intel Core i9-14900K features 24 cores and is capable of up to a 6GHz frequency. Meanwhile, the mid-range Intel Core i5-14600K has 14 cores and a max clock speed of 5.3GHz.

Thanks to those extra cores, high-end processors are generally better equipped to juggle multiple tasks at once, or to tackle complex workloads such as video editing and 3D animation. The higher clock speed will also provide a boost to overall performance.

What is a mid-range CPU? 

A mid-range processor is generally just a slightly less powerful version of its high-end counterpart, as long as they’re part of the same generation.

Intel Core i5-14600K being tested in a PC
Image Credit (Trusted Reviews)

For example, any processor in the latest 14th generation of Intel Core chips will share the same architecture. But when comparing two chips from differing generations (14th Gen vs 11th Gen for example) there will most likely be fundamental differences to how the chip was built, which can greatly impact performance. 

So if you’re choosing between two processors from the same generation (such as the i9-14900K and i5-14600K) then the only differences will be the peak frequency speed and number of cores. 

Is the difference worth the cost?

It’s obvious that a high-end processor is capable of a more powerful performance than its mid-range counterpart (as long as they’re part of the same generation) but is the performance difference actually worth the cost? This all depends on the workload.

The greatest advantage a high-end processor will have over a mid-range one is a greater number of cores and threads. However, many applications aren’t designed, or strenuous enough, to take advantage of numerous cores simultaneously. 
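
Amdahl's law captures this limit neatly: if only a fraction p of a task can run in parallel, the maximum speedup on n cores is 1 / ((1 - p) + p / n). The quick Python sketch below (illustrative figures, not benchmark data) uses the i5's 14 cores and the i9's 24 cores to show why extra cores offer diminishing returns for poorly parallelised software:

    # Amdahl's law: theoretical speedup on n cores when only a
    # fraction p of the workload can run in parallel.
    def speedup(p: float, n: int) -> float:
        return 1 / ((1 - p) + p / n)

    for p in (0.5, 0.95):  # a 50% parallel task vs a 95% parallel one
        print(f"p={p}: 14 cores -> {speedup(p, 14):.1f}x, "
              f"24 cores -> {speedup(p, 24):.1f}x")

For a task that's only half parallel, jumping from 14 to 24 cores barely moves the needle (around 1.9x either way), which is exactly why a mid-range chip often feels just as fast.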

For example, web browsing, video streaming and word processing are such simple tasks that you most likely wouldn’t notice much of a difference between the performance of an i5 and an i9 processor. 

Strangely enough, most video games don't require the power of numerous CPU cores either. This means that an i5 CPU will often be nearly as competent as an i9 chip when it comes to elevating gaming performance. As a result, we often recommend gamers opt for a mid-range processor instead of a high-end one in order to get the best value for money.

Our benchmark results demonstrate the performance difference between the Intel Core i9-14900K and Intel Core i5-14600K. The former offers superior performance, but not by a significant margin in modern games: there's only a 4fps advantage in Horizon Zero Dawn, a 1fps boost in Returnal and a 3fps gain in F1 22.

A high-end processor can offer a greater performance advantage for games that aren’t too restricted by GPU performance. For example, the i9 saw a 31fps performance boost compared to the i5 for Dirt Rally, although the performance was extremely high for both chips.

It’s also worth mentioning that all of the tests above are for a 1080p resolution. Crank up the resolution to 4K, and you’ll see an even smaller difference in gaming performance between the chips. Instead, it’s the GPU performance that becomes more important.

So does that mean there’s no point in buying a high-end processor? Not quite. A high-end chip will generally excel at workloads that make use of multi-threaded performance. This includes tasks such as video editing, 3D animation, machine learning and more. We typically recommend a high-end processor for creative professionals who deal with complex productions on a frequent basis. For such workloads, you should notice a big performance difference compared to using a mid-range chip. 

So whether the high cost of a high-end chip is worth it really just comes down to your workloads. Most people, including gamers, will likely be perfectly fine with a mid-range chip. But if you’re someone who deals with CPU-intensive workloads on a consistent basis, then the added firepower of a high-end chip will be worth it in the long run. 


Ctrl+Alt+Del: Slow down Apple, it's too early for an M4 chip

OPINION: This coming Tuesday, Apple is holding yet another showcase event where it’s expected to unveil a new range of iPads. And if rumours are to be believed, the next iPad Pro will be powered by the Apple M4 chip. 

Apple analyst Mark Gurman reports (via Bloomberg) that Apple could launch an M4-powered iPad Pro at the Let Loose event. What's more, he suggests that Apple will also upgrade the iMac, MacBook Pro and Mac Mini with the M4 chip later in the year.

This has all come as a big surprise, with Apple only introducing the M3 chip back in October 2023. If Gurman's report is accurate, that would mean only a seven-month gap between the M3 and M4 generations. Some may applaud Apple for the speed at which it's producing new chips, but I think this is more of a negative than a positive.

More of a negative than a positive

Most chip makers, such as AMD and Intel, typically leave at least a 12-month gap between processor generations. This is frequent enough to ensure the brands are continuously pushing the envelope, but it also leaves customers enough time to purchase a new device without worrying that it will become outdated within a matter of months.

Apple M1 – November 2020
Apple M2 – June 2022
Apple M3 – October 2023
Apple M4 – May 2024?

This is the same tactic that Apple uses for its iPhone launches, with new phones hitting stores every September like clockwork. When spending big bucks on the iPhone 15 Pro Max, Apple fans want to feel assured that this will be the ultimate flagship iPhone for at least the next 12 months. If there was any threat of a successor launching a few months later instead, then the appeal of buying the new phone would diminish. 

Unfortunately, Apple hasn’t been quite as structured with its Mac releases. In March this year, Apple launched a new MacBook Air powered by the cutting-edge M3 chip. It wasn’t a huge upgrade overall, but the new processor at least provided a welcome performance boost. However, just two months later, reports now indicate that the iPad Pro could benefit from an M4 upgrade. This would put the tablet in a strange position of being a more powerful option than Apple’s entry-level laptop.

If I had bought a MacBook Air in the last couple of months, I’d be pretty annoyed by the potentially imminent release of the M4 chip. Suddenly, the new laptop feels outdated, and I’d be wondering why Apple didn’t instead wait a couple of months to upgrade it with the more advanced chip. 

MacBook Air M3 angled on table
MacBook Air – Image Credit (Trusted Reviews)

Gurman suggests that Apple has decided to accelerate its computer processor upgrades in order to show off the M4 chip’s new AI capabilities. This makes a lot of sense, with Intel recently making huge strides in AI innovation, and Qualcomm making big AI performance claims about its upcoming Snapdragon X Elite chips. The likes of Microsoft, Nvidia and Adobe have also spent the last couple of years doubling down on AI advancements, while Apple has been surprisingly quiet on the AI front in comparison and has arguably been left behind by its rivals.

The Apple M4 chip will seemingly rectify that, with the iPad Pro acting as Apple’s very first AI-powered device. I’m excited to see what kind of AI features Apple will introduce, with the company in a perfect position to innovate due to its seamless integration of both software and hardware. 

However, Apple has long known about the rapid advancements and popularity of AI, so that doesn’t excuse its poor planning. The M3 has felt like a rather pointless generation to me, especially since it’s failed to materialise in several devices (including the Mac Mini and iPad Pro).

iPad Pro M2 no keyboard or pencil
M2-powered iPad Pro – Image Credit (Trusted Reviews)

Sure, it’s helped to boost sales of Apple’s Macs, but it’s probably left many new Mac owners feeling buyer’s remorse with the M4 seemingly arriving just a matter of months after. Maybe it’s the case of the M3 chip arriving too late as opposed to the M4 arriving too early, but it’s still an issue that needs addressing.

The move over to Apple Silicon has undoubtedly been a huge success for Apple, but if it wants to continue competing with the likes of AMD and Intel, it really needs to stick more faithfully to a structured roadmap. This way, customers can feel confident that their new Mac purchase will offer cutting-edge performance for at least the next 12 months, rather than feeling outdated in a matter of weeks.


Ctrl+Alt+Del is our weekly computing-focused opinion column where we delve deeper into the world of PCs, laptops, handhelds, peripherals and more. Get it straight into your email inbox every Saturday by signing up for the newsletter.


How to optimise your gaming PC for better FPS performance

A high frame rate is crucial for smooth PC gaming, especially in competitive games where extra frames can provide a significant advantage.

One of the simplest ways to increase frame rates is to purchase a more powerful graphics card, although this can be an expensive endeavour, and so isn't a realistic option for many people.

We’ve created this guide to help you improve your FPS performance for free, with some simple optimisations helping to maximise the power of your PC. Check out all of the steps below to find out how to boost your performance. 

And if you’ve tried all of the above and still can’t push up the frame rate to your desired level, then check out our Best Graphics Card, Best CPU and Best Gaming Laptop guides to consider a hardware upgrade. 

What you’ll need

  • A gaming PC

The short version

  1. Upgrade your graphics drivers
  2. Reduce the resolution 
  3. Activate DLSS or FSR
  4. Deactivate features such as ray tracing, Vsync and Anti-aliasing
  5. Ensure Game Mode is activated on Windows
  6. Configure your laptop’s power settings

Step 1: Upgrade your graphics drivers

    The most important thing to do in order to optimise gaming performance is to update your graphics drivers. The likes of AMD and Nvidia are constantly releasing new drivers to keep their graphics cards optimised, so it's important to keep your system updated. 

    If you own an Nvidia card, then make sure to download GeForce Experience, which will automatically download new updates for you. If you own an AMD graphics card, then download AMD Software: Adrenalin Edition instead. 
    Nvidia GeForce Experience

Step 2: Reduce the resolution

    If your main priority is boosting frame rates, and you don’t mind if that means sacrificing visual quality, then reducing the resolution is a great way to boost performance. 

    Reducing the resolution cuts the workload on your graphics card, allowing you to achieve higher frame rates, as the quick sums below show. This is why most esports professionals typically opt for a 1080p or 1440p resolution instead of 4K. 
    Lower resolution
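
    As a rough guide to how much work a lower resolution saves, this quick Python sketch (illustrative only) counts the pixels the GPU must shade each frame:

        # Illustrative only: pixels shaded per frame at common resolutions.
        resolutions = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}
        base = 3840 * 2160
        for name, (w, h) in resolutions.items():
            print(f"{name}: {w * h:,} pixels per frame ({(w * h) / base:.0%} of the 4K workload)")

    Dropping from 4K to 1080p means shading only a quarter of the pixels each frame, which is why the fps gain is so dramatic.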

Step 3: Activate DLSS or FSR

    If you’re lucky enough to own an Nvidia RTX graphics card, then you’ll have access to its DLSS technology. This uses AI to boost the FPS performance of a game, and has been highly effective in the most recent range of GeForce graphics cards. The big caveat here is that not every game supports DLSS, and so is mostly reserved to newer titles. 

    If you don’t own an RTX card, then you should have access to AMD’s FSR technology instead. This isn’t quite as advanced as DLSS, but is still effective at boosting frame rate thanks to upscaling solutions, although can cause unwelcome artefacts to the game’s visuals. But if boosting frame rate is your priority, FSR is still a useful tool. 

    You can activate DLSS or FSR through a game’s settings menu. It’s worth using an FPS counter (available through Steam) to see what kind of effect they have on performance. 
    DLSS setting
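
    If you're curious how such counters arrive at their numbers, average fps and "1% low" figures are derived from individual frame times. A minimal Python sketch (with made-up frame times) shows the idea:

        # Illustrative only: deriving average fps and 1% lows from frame times.
        frame_times_ms = [16.7, 16.9, 17.1, 16.5, 33.4, 16.8] * 50  # made-up sample
        avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
        worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
        low_fps = 1000 / (sum(worst) / len(worst))  # slowest 1% of frames
        print(f"Average: {avg_fps:.0f}fps, 1% low: {low_fps:.0f}fps")

    The 1% low figure matters because occasional long frames are what you feel as stutter, even when the average looks healthy.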

Step 4: Deactivate features such as ray tracing, Vsync and Anti-aliasing

    Graphics cards now make use of dozens of features to help video games look as realistic as possible. They’re excellent options for improving visuals, but they can often be a detriment to the FPS performance. Fortunately, you’ll be able to deactivate each feature through a game’s graphics settings menu. 

    Ray tracing is the big one to look for. This is a cutting-edge technology that improves the lighting, shadow and reflection effects in modern games, but it's extremely taxing on the GPU, so deactivating it should see your performance improve. Some games will have it turned on by default, so it's worth having a look. 

    Vsync is also a great feature for preventing screen tearing, but it will restrict your FPS performance somewhat. Anti-aliasing will often be on by default too, as it helps to smooth out ugly jagged edges on in-game objects. Turning off both of these features should help to improve your PC's performance. 

    It’s also worth checking whether you can reduce the draw distance in a game. This determines how much of a 3D environment will be rendered at one given time, with a high draw distance allowing you to view in-game environments and objects from a greater distance. Reducing this will of course ease the pressure on your GPU and increase performance. 
    Ray tracing deactivate

Step 5: Ensure Game Mode is activated on Windows

    Game Mode is a highly useful tool built into Windows which will automatically disable background tasks on your PC while you’re playing a game. This will dedicate all of your PC’s resources to your game to ensure the fastest performance possible. 

    Game Mode is generally enabled by default, but it’s still worth double checking as it’s an easy fix. Simply search Game Mode in the Windows search bar, and slide the toggle to On. 
    Windows Game Mode

Step 6: Configure your laptop's power settings

    This step is specifically for gamers using a laptop or a handheld device, as it involves boosting performance to the detriment of your device’s battery life. On a Windows device, you can do this by going to Settings > System > Power & Battery > Power Mode, and then selecting Best Performance. 

    Gaming laptops will sometimes even have a Turbo mode, which not only maximises performance but also increases power to the fans to keep your system as cool as possible during such workloads. You will of course get even better performance by making sure your portable is plugged into a power source.
    Power modes laptop

Troubleshooting

Which PC parts increase FPS?

The graphics card (or GPU) is the biggest influencer on FPS performance, so it should be at the top of your list of parts to upgrade. That said, upgrading your CPU and RAM can also have a positive effect on performance.

Is 60 FPS good for gaming?

A 60fps performance is perfectly fast enough for the vast majority of gamers. This is the performance you can expect from a PS5 or Xbox Series X. That said, those who want a competitive edge in the likes of first-person shooters will see the benefit of increasing the performance beyond 100fps.


How to benchmark your PC

Asus ROG Zephyrus G14 (2024) front

There are multiple reasons why you might want to benchmark a PC. You may simply be curious about its performance, or maybe you want to find a bottleneck to see which part of your PC should be upgraded next. 

We use multiple benchmarks on a weekly basis to evaluate the performance of the latest PC that we’re reviewing. You can check out our benchmark scores in all of our laptop, desktop PC and component reviews.

But if you’d like to benchmark your own PC, and see how it compares to alternative options, then check out our step-by-step guide below. 

What you’ll need

  • A PC
  • PCMark 10 (free)
  • Geekbench 6 (free)
  • 3DMark (free)
  • CrystalDiskMark (free)

The short version

  1. Test general performance with PCMark 10
  2. Test CPU performance with Geekbench 6
  3. Test GPU performance with 3DMark Time Spy
  4. Test SSD performance with CrystalDiskMark

Step 1: Test general performance with PCMark 10

    The first test we recommend is PCMark 10. This benchmark evaluates your PC’s overall performance for a wide range of workloads, including day-to-day tasks, productivity applications, and more demanding work with digital media. 

    PCMark 10 isn’t an ideal benchmark for isolating a performance bottleneck, but it is useful for getting a quick general overview of your PC’s performance. You can download the PCMark 10 demo for free on Steam, which offers unlimited runs with the main PCMark 10 test.

    PCMark 10 will give you a numerical score at the end of the benchmark, which is rather useless in isolation. However, you can compare the number with other scores online, including all of our reviews of the latest laptops and desktop PCs.
    PCMark 10

Step 2: Test CPU performance with Geekbench 6

    If you want to measure CPU performance specifically, then we recommend Geekbench 6. This determines both the CPU’s single-core and multi-core performance. The former is important for overall performance and gaming, while multi-core performance is more important for multitasking and more creative workloads such as video editing and 3D animation.

    Geekbench 6 is cross-platform, so it works on both Windows and Mac, and is free to download from the Geekbench website. Again, it will provide a numerical score, which you can compare to our own results in each of our computing reviews. 

    Geekbench 6 is designed to simulate generic workloads such as basic word processing and browsing the web. If you want to test your CPU on a more intensive workload, then we suggest trying out Cinebench instead, which is also free to download and use.
    Geekbench 6
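
    To get a feel for the single-core versus multi-core distinction, here's a crude Python sketch (nowhere near as rigorous as Geekbench's workloads) that times the same arithmetic on one core and then across all of them:

        # Crude illustration of single-core vs multi-core scaling.
        import time
        from concurrent.futures import ProcessPoolExecutor
        from os import cpu_count

        def busy_work(n: int) -> int:
            return sum(i * i for i in range(n))

        if __name__ == "__main__":
            chunks = [2_000_000] * (cpu_count() or 4)

            start = time.perf_counter()
            for chunk in chunks:  # one core, one chunk at a time
                busy_work(chunk)
            single = time.perf_counter() - start

            start = time.perf_counter()
            with ProcessPoolExecutor() as pool:  # spread across all cores
                list(pool.map(busy_work, chunks))
            multi = time.perf_counter() - start

            print(f"Single-core: {single:.2f}s, all cores: {multi:.2f}s")

    On a typical multi-core machine the parallel run finishes several times faster, though rarely by the full core count, since processes contend for memory bandwidth and thermal headroom.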

Step 3: Test GPU performance with 3DMark Time Spy

    If you’re keen to find out how your PC can handle graphics-intensive workloads, such as gaming and content creation, then we recommend using 3DMark Time Spy. This evaluates the performance of your GPU specifically. 

    3DMark Time Spy is free to run using the free demo of 3DMark over on Steam. Simply download the software and run the benchmark to get a numeric score, which you'll be able to compare to the results in our reviews. If you don't own a dedicated GPU/graphics card, then it's likely you'll get a relatively low score here, but that shouldn't be an issue unless you like to game on PC or consider yourself a professional digital creator. 

    Blender is a good alternative GPU benchmark if you want to evaluate performance specifically for content creation rather than gaming. 
    3DMark

Step 4: Test SSD performance with CrystalDiskMark

    The SSD is often overlooked compared to other components, and yet its performance is incredibly important for several workloads, from booting up the PC to loading applications and speeding through loading screens in a video game.

    CrystalDiskMark is our go-to SSD benchmark, as it measures both sequential and random performance. If you just want a generic look at the SSD performance, then it’s fine just to take the top two scores. The read performance determines how fast your PC is at loading up data (such as an app or game), while the write performance determines the speed of saving data onto your PC (such as installing software). 
    CrystalDiskMark
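
    To see what CrystalDiskMark's sequential figures represent, here's a very crude Python sketch that times a large write and read (illustrative only; OS caching means the read figure will usually be optimistic):

        # Very crude sequential write/read timing; not a substitute
        # for CrystalDiskMark's controlled runs.
        import os, time

        path = "disk_test.bin"
        data = os.urandom(256 * 1024 * 1024)  # 256MB of random data

        start = time.perf_counter()
        with open(path, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # force the data onto the drive
        write_mbps = 256 / (time.perf_counter() - start)

        start = time.perf_counter()
        with open(path, "rb") as f:
            f.read()
        read_mbps = 256 / (time.perf_counter() - start)

        os.remove(path)
        print(f"Sequential write: {write_mbps:.0f}MB/s, read: {read_mbps:.0f}MB/s")

    Run it a couple of times and take the best figure, as background tasks can easily skew a single run.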

Troubleshooting

How can I benchmark my PC for free?

All of the benchmark software listed above is available for free. Many have paid-for tiers that unlock extra benchmarks, but most people won't need that level of depth.

What is a good PC benchmark score?

This is subjective and differs between benchmarks. For example, PCMark 10 suggests at least a score of 4100 for basic tasks. Check each benchmark’s website for their recommendations, but also compare with other systems using data found online, including our computing reviews.


How to find the right laptop charger

Microsoft Surface Laptop 5
Microsoft Surface Laptop 5 – Image Credit (Trusted Reviews)

You’d think that purchasing a new laptop charger should be a relatively simple process, but that’s sadly not the case, as a fair amount of research is required. 

Purchase the wrong charger, and it may not deliver enough power to charge up the laptop at a sufficient speed. It’s also possible that your charger won’t even fit into your chosen laptop, making it a pointless purchase.

To help ensure you avoid these potential problems, we’ve created this guide to help you find the right laptop charger. We may even help you save some money, as sometimes a universal charger can be cheaper than the default option bundled with your PC. 

So if you’re looking to buy a new laptop charger, make sure to keep on reading so you maximise value for money. 

Check the connection 

The very first thing you need to consider when buying a laptop charger is whether it uses the right connection – otherwise, your charger won’t even be able to plug into the laptop. 

USB-C is gradually becoming the default charging connector for laptops, with a port that looks like a small rounded oval. It's the same connector used on modern smartphones, which makes it handy for sharing one universal charger across multiple devices.

USB-C

Not every laptop uses USB-C though, as many still use a barrel-style pin connector, which can vary in design and size depending on the manufacturer. This connection looks like a small round plug, and is often used by power-hungry portables such as gaming laptops. 

If your laptop uses a pin connector, then it's probably best to be on the safe side and purchase the charger directly from the manufacturer. To find out the specific model you own, check the label on the underside of the chassis, which lists the laptop's model and serial numbers.

If you’re keen to get a third-party charger, as it most likely will be cheaper, then make sure to visit your laptop manufacturer’s website to check the exact specification of the pin. 

MacBook Air M3 magsafe port
Magsafe – Image Credit (Trusted Reviews)

And then there are also a number of proprietary chargers from select manufacturers. For example, Apple now uses MagSafe chargers for its latest MacBook laptops, while Microsoft continues to use Surface Connect. 

Determine the wattage of your laptop 

The second step is to make sure that your charger is powerful enough to meet the wattage requirements of your laptop. If the charger wattage is too low, it may still power the laptop, but at a slower rate – maybe to such an extent that you won’t be able to use the laptop as it charges. 

This means that smartphone chargers are usually not powerful enough for your laptop, which is a shame when they’re both using the same USB-C connection. 

The wattage requirement should be printed in small text on the charger that was bundled in the box. For example, inspecting the text on our Samsung laptop's charger, we can see that it says 140W. 

Samsung Galaxy Book 3 Ultra laptop on wooden table.
Image Credit (Trusted Reviews)

If you’ve lost or misplaced your laptop charger and are seeking a replacement, then you’ll need to visit the manufacturer’s website instead. In our case, Samsung listed the wattage in the official specs of the laptop, but this won’t always be the case. You may need to find a manual or contact customer services instead. 

If you can’t find the exact wattage, then it’s better to go higher than lower. A higher wattage won’t cause any damage to your laptop, although charging won’t be any faster than with the default charger. That said, a higher wattage will likely be reflected in the price, so don’t expect it to be cheap. 
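If the label lists volts and amps but no wattage figure, you can work it out yourself, since watts are simply volts multiplied by amps. Below is a minimal Python sketch of that arithmetic; the figures are hypothetical examples rather than the specs of any particular charger.

def charger_wattage(volts: float, amps: float) -> float:
    # Power (W) is voltage (V) multiplied by current (A)
    return volts * amps

# Hypothetical figures, as printed on a charger label
print(charger_wattage(20.0, 7.0))  # 140.0 – enough for a 140W laptop
print(charger_wattage(5.0, 3.0))   # 15.0 – a typical phone charger, far too weak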

Conclusion 

When purchasing a laptop charger, you have two main options: pick the official charger from the same manufacturer as your laptop, or opt for a third-party option. 

The official charger will likely cost a lot of money, but you’ll be safe knowing that it will work without issue. Purchasing a third-party charger can often be cheaper, but you’ll need to do your own research on the connection and wattage to make sure it works correctly with your laptop. There’s less risk going for a USB-C laptop charger than one with a pin design. 

The post How to find the right laptop charger appeared first on Trusted Reviews.

How to mute someone on Instagram

The instagram logo on a pink background

Everyone likely has that friend or family member on Instagram who makes annoying daily posts. Blocking someone can feel like too much of a drastic step, making the mute function a far more preferable option. 

Muting someone won’t stop you from following the account, therefore avoiding any awkward confrontations; it simply hides their posts from your timeline. You even get the option to mute posts, stories and/or notes. 

Reversing the process is very easy too, so it’s possible to mute someone just for a few weeks if you do want to see their posts again in the future. 

The process of muting accounts on Instagram is very simple, and we’ve detailed a step-by-step guide below to help you out. This guide covers smartphones, but you can check out the Troubleshooting section further below for guidance on using a PC. 

What you’ll need

  • An Instagram account
  • A smartphone or PC 

The short version

  1. Head over to the Instagram account you want to mute 
  2. Click on Following 
  3. Select Mute
  4. Choose to mute Posts, Stories and/or Notes
Step 1

    Head over to the Instagram account you want to mute 

    We located the Instagram account by clicking on the magnifying glass icon on the bottom taskbar, and then using the search bar. However, you can also navigate to an account by simply pressing on its profile picture if you spot it on your timeline. 
    How to mute someone on Instagram

Step 2

    Click on Following 

    The Following button can be found underneath the main profile picture on the account. It sits to the left side of the Message button. 
    How to mute someone on Instagram

Step 3

    Select Mute

    A new pop-up window should appear, with the Mute option on the third row down. Click on it.

    How to mute someone on Instagram

Step 4

    Choose to Mute Posts, Stories and/or Notes 

    You’ll now get the option to mute Posts, Stories and Notes. You can either select all of these options, or toggle them individually. This may be useful if you don’t like their posts popping up on the timeline, but don’t mind having the option to view their stories.

    And if you ever want to unmute the account, you simply need to click on the toggle again by repeating the same steps. 

    How to mute someone on Instagram

Troubleshooting

How do I mute someone on Instagram using a PC?

Muting someone on Instagram via PC is largely the same as on a smartphone. Simply visit the account you want to mute, select Following at the top, select Mute and then click on Posts and/or Stories.

Will people know that you have muted their account?

There is no official way for someone to find out that you have muted them on Instagram. However, they may be able to work out that you have muted their Stories if you never show up as a viewer despite being active.

The post How to mute someone on Instagram appeared first on Trusted Reviews.

How to check CPU core usage on Windows 11

Checking your PC’s usage of each CPU core is a good way to determine how your system is dealing with the current workload. 

This can help inform you whether you have a powerful enough processor to deal with the task at hand. Most simple tasks will only require the performance of a single CPU core, but more complex workloads, such as video editing or gaming, will require more resources. 

It’s difficult to determine which components of your PC are causing a performance bottleneck, but checking the CPU core usage can go a long way to finding the answer. 

In this guide we’ll be detailing how to find the usage of each of your CPU cores when running any tasks on your Windows PC. 

What you’ll need

  • A Windows PC

The short version 

  1. Open Task Manager
  2. Click on the Performance tab
  3. Right-click on the graph
  4. Hover over ‘Change graph to’ and select Logical processors 
  5. Check out the detailed breakdown of your CPU
Step 1

    Open Task Manager

    We did this by searching Task Manager in the Windows search bar, but you can also use the keyboard shortcut Ctrl + Shift + Esc.

    How to check CPU core usage on Windows 11

Step 2

    Click on the Performance tab

    The processes tab should open by default, listing all of the apps and current processes that your PC is running. To check your CPU core usage, click on the Performance tab in the left-hand column.
    How to check CPU core usage on Windows 11

Step 3

    Right-click on the graph

    You should now see a graph detailing the overall utilisation of your CPU, with metrics such as the speed listed underneath. For a breakdown of the usage per core, right-click on the graph.
    How to check CPU core usage on Windows 11

Step 4

    Hover over ‘Change graph to’ and select Logical processors

    After right-clicking, a pop-up box should appear. Hover over the ‘Change graph to’ option listed at the top, and then click on the option for Logical processors.
    How to check CPU core usage on Windows 11

Step 5

    Check out the detailed breakdown of your CPU

    Once you’ve completed the prior step, you should notice that the graph has been split into multiple smaller graphs, with the number depending on how many threads your CPU has. In our laptop’s case, the 22 threads are represented by 22 graphs, each providing a snapshot of its current utilisation.
    How to check CPU core usage on Windows 11

Troubleshooting

How to check how many cores your CPU has

If you want to find out how many cores your PC’s CPU has, then go to Task Manager > Performance. You should see the number of cores listed below the graph.

Why aren’t all of the cores being fully utilised?

You may notice that only one core is working hard, while the rest remain dormant. This is nothing to be concerned about, as most basic workloads on a Windows PC do not require the utilisation of multiple cores. You only really need a large number of cores running at full tilt when tackling complex tasks such as video editing and 3D animation.
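If you’d rather read these figures programmatically, say for logging core usage over time, the same information Task Manager shows can be sampled with a short script. Here’s a minimal Python sketch, assuming the third-party psutil package is installed (pip install psutil):

import psutil

# Physical cores versus logical processors (threads)
print("Physical cores:", psutil.cpu_count(logical=False))
print("Logical processors:", psutil.cpu_count(logical=True))

# Sample each logical processor's utilisation over one second,
# mirroring Task Manager's 'Logical processors' graph view
for index, usage in enumerate(psutil.cpu_percent(interval=1, percpu=True)):
    print(f"Logical processor {index}: {usage}%")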

The post How to check CPU core usage on Windows 11 appeared first on Trusted Reviews.

Snapdragon X Plus vs Snapdragon X Elite: What's the difference?

Qualcomm is gearing up to take on Apple, AMD and Intel in the laptop space this year, with the launch of the new Snapdragon X Elite and Snapdragon X Plus chips. 

Both of these new laptop chips are based on Arm architecture, as the company hopes to replicate the success of Apple Silicon.

But what are the differences between the Snapdragon X Plus and Snapdragon X Elite? We’ve created this comparison guide to highlight the key specs.

Snapdragon X Elite has more processing power

We haven’t had a chance to test either of these laptop processors just yet, but judging by the spec sheet, it looks like the Snapdragon X Elite will offer more processing power. 

The Snapdragon X Elite features 12 cores with a clock speed of up to 3.8GHz, whereas the Snapdragon X Plus packs in 10 cores at up to 3.4GHz. 

This means that the Elite not only has two extra cores to aid with multi-tasking, but can also reach a higher clock speed, which will have a big impact on overall performance. 

Qualcomm has also announced that both chips will support up to 64GB RAM, although this will be dependent on the configuration options for each laptop. 

Snapdragon X Elite has stronger GPU performance 

The two Snapdragon processors feature an Adreno GPU, which will be important for graphics-intensive workloads such as gaming and content creation. 

However, the specs of the Adreno GPU do differ between the two Snapdragon processors. For the Snapdragon X Plus, the GPU performance is capable of up to 3.8 TFLOPs. As for the Snapdragon X Elite, it has three different flavours depending on which laptop you go for, with two limited to 3.8 TFLOPs, and the third capable of up to 4.6 TFLOPs. 

This means the Snapdragon X Elite has a higher performance ceiling when it comes to graphics. It’s important to note that these are just integrated graphics though, and so are unlikely to compete with discrete options from Nvidia and AMD. Expect it to be more of a rival to the Apple M3 chip than components designed for gaming machines. 

Snapdragon X Plus AI performance
Credit: Qualcomm

Snapdragon X Plus shares the same AI performance 

Laptop makers are investing a lot into AI this year, so it’s no surprise to see that Qualcomm is making sure its new Snapdragon chips are also up to snuff in this area. Both chips feature a Qualcomm Hexagon NPU to maximise AI performance. 

Surprisingly, the Snapdragon X Plus seemingly matches the Snapdragon X Elite for AI performance, with both offering up to 45 TOPS (Trillion Operations Per Second). Qualcomm claims this is the world’s fastest NPU for laptops.

What does this all mean though? A high AI performance will speed up processes that utilise artificial intelligence. Qualcomm confirms that its chips will support Windows Studio Effects, which can blur your background and block out unwanted sounds during a video call. Plenty more AI-powered apps are expected to arrive in the near future too, putting Qualcomm in a very strong position. 

Both processors support Wi-Fi 7

Wi-Fi 7 is the latest wireless technology, offering peak rates of up to 40 Gbps when using the 6GHz band. It also supports new features such as Multi-Link Operation, which allows a device to connect over multiple bands simultaneously in order to maximise performance and efficiency. 

Wi-Fi 7 devices are still thin on the ground, but Qualcomm has confirmed that its new chips will support the new wireless technology. Of course, you will need a Wi-Fi 7 router in order to reap these benefits, which still costs a lot of money. 

Fortunately, there will be backwards compatibility for Wi-Fi 6E and Wi-Fi 6 too, so your laptop will still be able to make use of older wireless setups before you eventually make the upgrade. 

The post Snapdragon X Plus vs Snapdragon X Elite: What's the difference? appeared first on Trusted Reviews.

Why does one CPU core work harder than others?

Modern CPUs can feature a remarkable number of cores, pushing the performance ceiling to all-new heights.

For example, the Intel Core i9-14900K desktop processor is packed with 24 cores, making it far more powerful than more basic dual-core and quad-core chips.

However, anyone with a PC may have noticed that most computer-based workloads will only utilise a single core. That may prove frustrating for those who have invested so much money into a processor with so many cores, so why does this happen? 

We’ve created this guide to explain why one CPU core will generally work harder than the rest, and why this shouldn’t be cause for concern for your PC. 

One core to rule them all

If you check your CPU performance via Windows Task Manager, you’ll likely see that the vast majority of your processor’s workload has been given to Core 0, rather than being split evenly across all of the available cores. 

This is because most workloads don’t require the use of multiple cores. Basic tasks such as browsing the web or using a single application are not very intensive on the processor, and so will usually only require the performance of a single core. 

It would be entirely possible for the likes of Windows to distribute the load more evenly, but running multiple cores at high frequency causes the chip to consume more power, which is wasteful when a single core can comfortably cope with the workload.

It’s actually fairly rare for an application to make use of all of your cores, especially if you have a 24-core desktop chip. Even intensive workloads such as gaming probably won’t make full use of all of your cores. 
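You can observe this behaviour for yourself with a small experiment. The sketch below, a Python example assuming the third-party psutil package is installed, starts one deliberately CPU-bound worker process and then samples per-core utilisation; expect one logical processor to sit near 100% while the rest stay largely idle (though the busy core may hop around as the scheduler migrates the process).

import multiprocessing
import psutil

def busy_loop():
    # A CPU-bound task that never yields – it can only ever occupy one core
    while True:
        pass

if __name__ == "__main__":
    worker = multiprocessing.Process(target=busy_loop, daemon=True)
    worker.start()

    # Sample each logical processor's utilisation over two seconds;
    # one should be pegged near 100% while the others stay mostly idle
    for index, usage in enumerate(psutil.cpu_percent(interval=2, percpu=True)):
        print(f"Logical processor {index}: {usage}%")

    worker.terminate()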

Wasted potential

The vast majority of games only require a small number of CPU cores in order to run on your PC. 

For example, Cyberpunk 2077 is one of the most technically intensive games available, and yet recommends a quad-core Intel chip for the minimum requirements. The recommended specs list a more powerful 12-core processor instead, but it’s still a far cry from the 24-core chips that are now available. 

Cranking up the core count is unlikely to cause a significant performance boost either. Sticking with Cyberpunk 2077 at a 1080p resolution, we recorded an average of 224fps with the 14-core Intel Core i5-14600K, while upgrading to the 24-core Intel Core i9-14900K saw an average of 221fps. While this looks like a drop in performance, it’s likely just a minor fluctuation, indicating that the two chips deliver almost identical performance in this specific game. 

Cyberpunk

Cyberpunk 2077 isn’t designed to make use of more than eight cores at a time, so additional cores are effectively wasted here. This means that, as long as you have at least eight cores, single-core performance will generally be the biggest factor in determining how much influence a CPU has on your PC’s gaming performance. 

Like most games, Cyberpunk relies more on the GPU when it comes to performance. Upgrading your GPU will likely have a far bigger effect on performance than upgrading to a processor with more cores. There are certain games that are more CPU intensive than usual, such as the Total War and Civilization series, but GPU performance remains the most important factor. 

Multi-core fun

After establishing that most computer workloads don’t actually require a large number of cores, you may be wondering what the point is in investing in a 24-core desktop chip. Well, there are specific workloads that do make use of such processing power. 

Multiple cores are more useful for applications that are likely to run multiple complex processes simultaneously. This includes workloads such as video editing, 3D animation and batch processing. 

Adobe claims that Premiere Pro runs at 93-98% efficiency with eight cores, while a whopping 32-core setup is suggested for demanding workloads in After Effects when taking full advantage of Multi-Frame Rendering.

Adobe Premiere Pro

If you find yourself running these sorts of applications, you’ll likely find your CPU utilising more cores than usual in order to deliver the required performance. Performance is likely to scale up when using more cores with such apps, whereas that isn’t the case with simpler processes.
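To illustrate why these batch-style workloads scale with core count, here’s a minimal sketch using only Python’s standard library. It times the same set of independent jobs run serially and then spread across every core with a process pool; on a multi-core machine, the parallel run should finish several times faster.

import time
from multiprocessing import Pool, cpu_count

def crunch(n: int) -> int:
    # Stand-in for one independent chunk of work, such as rendering a single frame
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    jobs = [2_000_000] * 16

    start = time.perf_counter()
    serial = [crunch(n) for n in jobs]
    print(f"Serial: {time.perf_counter() - start:.2f}s on one core")

    start = time.perf_counter()
    with Pool(processes=cpu_count()) as pool:
        parallel = pool.map(crunch, jobs)
    print(f"Parallel: {time.perf_counter() - start:.2f}s across {cpu_count()} cores")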

And while we’ve established that video games don’t necessarily require a lot of cores, it’s a different story if you want to record footage or broadcast live gameplay, as more cores are required to perform all of these different tasks simultaneously.

With all of this in mind, you should really consider how many cores you’ll likely need from a CPU before buying one. If you’re going to be sticking to basic tasks, then it’s likely you’ll be wasting the potential of the majority of cores in a high-end processor. But the extra cores may well come in handy when dealing with more complex workloads or running multiple processes at one time.

The post Why does one CPU core work harder than others? appeared first on Trusted Reviews.

How to reset your graphics driver on Windows

Windows 11
Image Credit (Microsoft)

Resetting your graphics driver is a useful fix for a number of scenarios on your Windows PC.

If you’re encountering odd screen flickering, or have noticed that your graphics card is behaving erratically after an update, then resetting your graphics driver can sometimes be an easy fix to get your PC working correctly again.

The quickest way to reset your graphics driver on Windows is to use the keyboard shortcut Ctrl + Windows + Shift + B. After hitting this key combo, you should hear a beeping noise, before your display goes black and then pops back into view. 

We’ve also detailed the manual method below, just in case the keyboard shortcut isn’t working for you. 

What you’ll need

  • A Windows PC

The short version

  1. Open Device Manager 
  2. Expand Display Adaptors 
  3. Right-click on your graphics driver and click on Properties 
  4. Click on the Driver tab
  5. Select Disable Device and then Yes
  6. Select Enable Device 
Step 1

    Open Device Manager

    You can locate this by typing Device Manager into the Windows search bar found in the Taskbar.
    How to reset graphics driver

Step 2

    Expand Display Adaptors 

    This should be the sixth option down. You can expand Display Adaptors by pressing on the small arrow underneath.
    How to reset graphics driver

Step 3

    Right-click on your graphics driver and click on Properties 

    The name of your GPU should now appear, indicating your graphics driver. Right-click this, and then select Properties at the bottom of the pop-up window.
    How to reset graphics driver

Step 4

    Click on the Driver tab

    At the top of the pop-up window, you should find a row of tabs. Click on the second one along, which should be labelled as Driver. 
    How to reset graphics driver

Step 5

    Select Disable Device and then Yes

    Near the bottom of the window, you should find the option for Disable Device. Click on this. You’ll then get a warning pop-up message. Confirm your decision by pressing Yes. Your screen will now go black for a couple of seconds, before reappearing. You may notice that your display now looks a bit odd, but there’s no need to worry.
    How to reset graphics driver

Step 6

    Select Enable Device 

    Once your display returns after a momentary pause, make sure to click on Enable Device. Your screen should then go black again, but will return to normal. Your graphics driver has now been reset, and hopefully your display issues are fixed.
    How to reset graphics driver

Troubleshooting

The display issues haven’t been fixed

If you’re still experiencing display issues despite resetting your graphics driver, then it’s worth uninstalling your drivers (you can find the option in Step 5) and then visiting the relevant manufacturer’s website to install the latest drivers for your specific GPU. Updating your PC is also worth a try.

What is the keyboard shortcut to reset your graphics driver?

Using the keyboard shortcut Ctrl + Windows + Shift + B will reset your graphics driver in an instant.

The post How to reset your graphics driver on Windows appeared first on Trusted Reviews.

The Starlink Satellite kit is now half price at Currys

Starlink is a revolutionary new alternative to traditional broadband, letting you get online via a satellite connection instead. The price of entry has previously been a sticking point, but Currys has now slashed it in half. 

The Starlink starter kit, which includes both the satellite dish and a router, is available for just £224.50 over at Currys – half its original price. 

Save over £200 on the Starlink Satellite router kit

The Starlink Satellite kit is your entry point to satellite-connected internet, allowing you to get online in the middle of nowhere, without the need for fibre optic cables. Currys has slashed the price of the kit in half, making it available for just £224.50.

  • Currys
  • Save £224.50
  • Now £224.50
View Deal

The dish supports an internet connection of up to 100 Mbps, while the dual-band router allows access to both the 2.4 GHz and 5 GHz bands. 

The best aspect of Starlink is that it doesn’t require cables for an internet connection, which means – if you purchase the right service plan – you’ll be able to use the Starlink dish to get online in remote locations where it normally wouldn’t be possible. 

It’s also useful for homes with limited speed options for fibre optic broadband. Starlink is available in a wide range of locations right now, including North America, most of South America, the United Kingdom, Western Europe, Japan, Australia and more. 

You will of course need to sign up to a service plan with a monthly fee in order to get internet access. Starlink currently offers a personal plan for £75 per month, a roam plan from £85 per month and a boat plan from £247 per month. 

While Starlink certainly isn’t cheap, it’s still a very useful alternative to fibre optic broadband, and the slashed price of the required dish and router makes the entry point more accessible than before. 

So if you need a fast internet connection in a remote area, whether you live in the countryside or travel regularly, then this discounted Starlink kit is the perfect springboard to satellite-connected internet. 

The post The Starlink Satellite kit is now half price at Currys appeared first on Trusted Reviews.

Upgrade your Xbox controllers with this discounted rechargeable battery pack

Xbox owners will know the pain of seeing the low battery indicator on their controller when there are no spare batteries handy at home. The Xbox Play USB Charging Kit removes this pain point, and is available to buy for just £17.99 on Amazon. 

Amazon has slashed the price of the Xbox Play USB Charging Kit by 14%, taking it down to a more affordable £17.99. That’s the cheapest price we’ve seen for it in a long time. 

Save 14% on the Xbox Play USB Charging Kit

The Xbox Play USB Charging Kit removes the need for batteries when using an Xbox controller, allowing you to charge it via a wired connection instead. The charging kit can currently be yours for just £17.99.

  • Amazon UK
  • Save 14%
  • Now £17.99
View Deal

The Xbox Play USB Charging Kit is a rechargeable battery pack, which can slot into your Xbox controller where the batteries normally reside. Bundled with a USB-C cable, you’re able to plug the controller into the console – or any other power source – to charge it up.

Handily, the Xbox Play USB Charging Kit allows you to charge up your Xbox controller both while you’re gaming and when it’s powered off – you can even use the Xbox console for charging while it’s in standby mode. This means you’ll never have to stop gaming again if you see the dreaded battery indicator flash up on your screen. 

This is an official product from Microsoft, which confirms that this rechargeable battery pack is compatible with both Xbox Series X/S and Xbox One consoles. Microsoft claims the battery pack can fully recharge within 4 hours.

The Xbox Play USB Charging Kit currently has an average customer rating of 4.4 out of 5 from 7,635 ratings on Amazon. One happy customer wrote: “The Xbox Play and Charge Kit is an essential accessory for any avid Xbox gamer who wants to enjoy uninterrupted gameplay without the hassle of disposable batteries. This rechargeable battery kit has truly revolutionised my gaming experience, and I can’t recommend it enough.”

So if you’re fed up with having to borrow your TV remote’s batteries every time your Xbox controller runs out of juice, make sure to take advantage of this great deal. 

The post Upgrade your Xbox controllers with this discounted rechargeable battery pack appeared first on Trusted Reviews.
