Tuesday, February 28, 2023

NEWS TECHNOLOGIE

RTX 3050 partner cards.

As 2023 plods along, more bad news for the PC hardware industry has arrived from Jon Peddie Research. The market analysis firm has tallied up shipments of CPUs as well as discrete GPUs for the fourth quarter of 2022, and it’s not pretty. The numbers haven’t been this bad for GPUs since 2011. In addition, the firm predicts the GPU market will grow very slowly over the next five years.

The banner news from the report is the massive drop in GPU shipments in the past year. That’s not just for discrete GPUs either, but for all GPUs. The report claims that overall shipments declined 38% YoY. Broken down, desktop GPU shipments declined by 24% and notebook GPU shipments fell by 43%. This led to small changes in market share for AMD, Intel, and Nvidia. Intel still holds the lion’s share of the GPU market thanks to the integrated graphics that ship in most of its CPUs. On a year-over-year basis, though, AMD has lost 6% of the market while Nvidia has lost 2%.

GPU market share for Q4. (Image: Jon Peddie Research)

The quarter was rough on all three companies when it comes to shipping GPUs to gamers. Nvidia's shipments declined 11.7%, Intel's fell 16.5%, and AMD shipped 12.7% fewer GPUs. Overall, the companies shipped 15.3% fewer GPUs in what is typically a strong holiday season. Last year's holiday sales had already been reported as atypically bad, and that's no surprise: The economy started to creak in the months leading up to the season, fueling economic anxiety that persists to this day. Peddie notes that Q4 is usually flat or up, but this year it was way down.

The news on the CPU front is also negative. CPU shipments decreased 17.4% from the previous quarter and 35.3% year-over-year. For context, 84 million CPUs were sold in Q4 of 2021. The following year only 54 million were shipped. The analysis shows that of all the CPUs in customers’ hands, almost two-thirds are in notebooks. The other third are desktop CPUs, highlighting the importance of mobile chips in overall sales.

Despite the doom and gloom, there were a few bright spots in the report. For example, shipments of add-in boards for desktops increased by 7.8% from the previous quarter, and the GPU attach rate was up 3% as well. Those bright spots have prodded the Peddie group to remain optimistic despite the relatively grim numbers. The report concludes by saying the sky is dark but not falling.

In general, the industry is expected to remain soft through the next quarter. It doesn't seem to be a reflection of any one company's products or strategy, either, as all of the companies have been affected. However, there's plenty of uncertainty about when things will turn around. When Intel posted its quarterly earnings recently, it said it could only provide guidance for the current quarter. Intel's CEO said "macro weakness" could make for a rough first half of 2023, with the possibility of a rebound later this year.

from ExtremeTech https://ift.tt/HKzSiyt

NEWS TECHNOLOGIE

Hacker attacking internet. (Credit: seksan Mongkhonkhamsao/Getty Images)
LastPass has been in the news a lot lately, and not because it’s the internet’s number one password manager, as it still proudly proclaims. The company is still reeling from a series of hacks last year that resulted in a trove of user data being stolen. This week, LastPass released new details of the attacks, explaining that the attacker targeted a senior LastPass engineer to gain access to the sensitive internal information that made the data theft possible.

Problems started for LastPass in August 2022, when the company notified its customers of a "security incident" involving proprietary company information. It said at the time that no user data was accessed, but in November, it announced a second attack that did target the passwords and other sensitive data people had stored on LastPass' servers. The threat actor leveraged data stolen in the first phase of the attack in August, but how did they get that data in the first place? Well, it's not pretty.

LastPass explains in the latest investigation update that the attackers targeted a senior engineer at the company, one of only four people with access to the LastPass corporate vault. The employee in question was working from home, and their employer did not enforce any access restrictions. The DevOps engineer was accessing sensitive company data using a personal computer, which also ran a “media software package.” Other sources claim the media software in question is Plex, which reported a data breach around the same time. Using an undocumented vulnerability in the media software, the attacker installed a keylogger and waited for the engineer to enter the master password and two-factor code.

(Credit: René Ramos; LastPass)

That operation gave the threat actor the keys to the kingdom; they obtained decryption keys for the company's AWS-hosted backups, including critical databases and other resources. Because of the way LastPass had implemented access auditing, nothing seemed amiss at first, and the company didn't know about the second attack until Amazon alerted it to unusual activity on the account. The attacker made off with user password vaults that are only partially encrypted: The stored passwords themselves remain encrypted, but the vaults include plain-text URLs, emails, and IP addresses. Those encrypted passwords are protected only by each user's master password, which could be weak on older accounts.

In addition to the updated blog post, LastPass has published a rundown of all the data lost in the attacks. The company also provides a list of changes made to its security setup, but this is far from the first security issue for LastPass. It suffers a data breach of some sort almost every year, and it always says it has improved its security afterward. Perhaps LastPass, with millions of user passwords, is just too tempting a target. If you’ve got a LastPass account, it might be time to reevaluate.

from ExtremeTech https://ift.tt/vfEBZXq

NEWS TECHNOLOGIE

(Credit: Xiaomi)
There’s a consensus amongst tech folks that augmented and virtual reality would catch on if the headsets were more comfortable. So far, all we’ve had, aside from a few flash-in-the-pan concepts, are big, bulky headsets. Nobody wants to wear those all day, nor could they, as battery life becomes an issue. The thinking goes that a pair of lightweight augmented reality (AR) glasses could be the next big thing in computing.

Google famously tried this with Google Glass. Now Xiaomi has officially unveiled the Wireless AR Glass Discovery Edition at Mobile World Congress in Barcelona. Although Xiaomi describes them as "lightweight yet sturdy," they look pretty beefy. They weigh 4.4 ounces (126 grams), Engadget reports, which is far lighter than a virtual reality headset. For context, the new and quite slim HTC Vive XR Elite weighs 1.4 pounds (625 grams), still enough heft to be very noticeable.

Xiaomi's glasses are packed with advanced technology. They sport two Micro OLED displays that Xiaomi says can hit 1,200 nits of peak brightness. In front of those displays are "free form, light-guiding" prisms that offer "near retina" resolution. The glasses also use electrochromic lenses that can change their tint, enabling two modes: immersed and transparent. In immersed mode, the lenses darken and shut you off from the world. In transparent mode, you can interact with your surroundings using AR gestures. The glasses are powered by the Snapdragon XR2 Gen 1 SoC, the same chip that powers the Meta Quest Pro.

You'll need a phone to do all of this, of course. Xiaomi says the glasses sync with a phone via a single NFC tap, which is what allows them to work as a wireless headset. The glasses also don't have any onboard storage, so they rely on the phone for that. The battery is a custom silicon-oxygen anode model, but it's unclear how many hours of operation it provides. The weight of the optics is offset by what sounds like a very light frame, made of a magnesium-lithium alloy structure and "carbon fiber materials."

You can see the gestures in Xiaomi's demo video, but they don't look particularly useful. The demo says you can place virtual objects anywhere in your space. You can also drag and zoom freely using your fingers. Additionally, you can flick through a social media feed with your fingers as if you were holding a phone. One useful feature is that you can drag what's playing on your TV into the glasses, though the picture is probably sharper on the TV itself; that's just a guess.

There's no word on whether these will ever go on sale or what they would cost; as a concept, they might never see the light of day. For AR glasses to really work, they need to be as powerful as possible, with all-day battery life, plenty of storage, and no more heft than a pair of reading glasses. Unfortunately, it could be at least a decade before we see something with those features. This is also not the first time Xiaomi has shown off a cutting-edge concept involving AR glasses: In 2021, it demoed a pair of very light AR glasses weighing only 1.8 ounces (51 grams). For most people, it's not hard to see how compelling these devices could be someday. Sadly, that day is still far in the future.

from ExtremeTech https://ift.tt/3UykzwN

NEWS TECHNOLOGIE

(Credit: Intel)
There’s been a lot of speculation about the fate of Intel’s 14th gen Meteor Lake architecture. The rumors have ranged from it being delayed, again, to it being mobile-only, to it being outright cancelled. Now a leaker is providing new details that indicate it might still be on track and will offer several new features. When it arrives and in what form are still open questions. The latest leak also links it to “Windows 12,” hinting at a 2024 launch.

The new details come from a Twitter user who has provided accurate scoops in the past, according to Videocardz. Twitter account @leaf_hobby dishes on the upcoming Z890 chipset and what it will offer. The banner feature for desktop Meteor Lake is the addition of four more PCIe Gen 5 lanes, topping out at 20; Raptor Lake offers just 16. This will allow a dedicated x16 connection to the GPU, with the additional four lanes supposedly reserved for M.2 storage. There will also be four PCIe Gen 4 lanes to the CPU, which is unchanged from Z690/Z790. Another significant addition is 24 Gen 4 lanes on the chipset, four more than Z790.

This is the previous leak of Meteor Lake’s block diagram, which was thought to be its mobile version. (Credit: Igor’s Lab)

Other notable mentions include the introduction of Wi-Fi 7. This technology is already available in some routers, but we're not aware of any client devices that support it yet; everything is still Wi-Fi 6 or 6E and should remain that way through 2023. The tipster also says their sources indicate Meteor Lake supports Windows 12. Microsoft previously showed off a new Windows interface at its Ignite conference in 2022, which led to rumors that it was Windows 12. As we wrote at the time, that release is supposedly slated for 2024. Together, these features point to a 2024 launch for Meteor Lake.

If that timing holds, it will differ from Intel's stated roadmap; the company's CEO even said recently that Meteor Lake is on schedule and coming out in 2023. What seems more likely for this year, however, is the previously rumored Raptor Lake refresh in the second half of 2023. Either way, Meteor Lake certainly still seems to exist. It's also possible these leaks are being fed to the press to help the company disavow rumors of the chip being cancelled or delayed further.

This new leak also confirms that Intel is reducing core counts for Meteor Lake. It was previously reported to offer six performance cores and eight efficiency cores, two fewer performance cores than Raptor Lake. That configuration is what led people to guess it was a mobile part, which this leak confirms, though there's also a version with six performance and 16 efficiency cores. Either configuration still offers fewer performance cores than Raptor Lake's eight.

We first saw Meteor Lake in the flesh way back in late 2021, when CNET took a tour of Fab 42 in Arizona and snapped some photos of it. Since then, there's been a steady drumbeat of leaks and rumors of delays. We saw its mobile block diagram last year, and Intel itself confirmed the six performance cores. The company also showed the chip off at last year's Vision conference. It is undoubtedly a real product that is very far along in its development. Whether a desktop version will come out in 2023, though, remains to be seen.

from ExtremeTech https://ift.tt/xmHV1Tg

NEWS TECHNOLOGIE

(Image: Adobe et al./MIT/Signal Kinetics)
Comic books and old sci-fi movies are rife with depictions of "X-ray vision" goggles, and now something very much like them has come out of MIT's augmented reality (AR) research. Researchers at MIT have created a headset that enables the user to "see" through solid material and locate objects that would otherwise remain hidden.

The headset, called X-AR, relies on radio frequency (RF) signals to operate. RF signals are wireless electromagnetic signals most often used for communication, like in walkie-talkies, mobile phones, and your favorite radio station. Their ability to pass through solid material makes for an ideal locator—provided the hidden object possesses an RFID tag.

RFID tags reflect RF signals emitted by an RF antenna. As these reflections occur, MIT's AR headset captures them and renders a transparent virtual sphere at the tagged item's location. The sphere tells the user where the item is, regardless of whether it's sitting in a cardboard box, around a corner, or under a pile of other objects. Once the user picks up the item, the headset verifies that they've grabbed the right thing.

(Image: Adobe et al./MIT/Signal Kinetics)

MIT associate professor Fadel Adib, who directs a wireless and sensor technologies group, led a team of research assistants and postdocs in creating the headset. The team started with a Microsoft HoloLens AR headset, added an RF antenna, and programmed the antenna to use synthetic aperture radar (SAR) techniques to measure the distance between itself and RFID-tagged objects. The approach proved highly effective thanks to humans' free range of motion: Frequent movement provided the SAR antenna with measurements from many vantage points, enabling more accurate localization.
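
To get a feel for why the wearer's motion matters, here's a toy Python sketch (our illustration, not MIT's code) that recovers a tag's position from distance readings taken at several antenna positions. Linearized least-squares multilateration is the basic geometric idea behind treating movement as a synthetic aperture; the positions and noise level below are invented for the example.

    import numpy as np

    def locate_tag(antenna_positions, ranges):
        # Subtract the first range equation from the rest to eliminate
        # the quadratic term, then solve the resulting linear system
        # in a least-squares sense.
        p = np.asarray(antenna_positions, dtype=float)
        r = np.asarray(ranges, dtype=float)
        A = 2.0 * (p[1:] - p[0])
        b = ((np.sum(p[1:] ** 2, axis=1) - r[1:] ** 2)
             - (np.sum(p[0] ** 2) - r[0] ** 2))
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x

    # A tag hidden at (1.0, 2.0, 0.5) meters; the wearer's movement
    # supplies readings from four slightly different antenna positions.
    tag = np.array([1.0, 2.0, 0.5])
    poses = np.array([[0.0, 0.0, 1.6], [0.5, 0.0, 1.7],
                      [1.0, 0.3, 1.5], [1.5, 0.8, 1.8]])
    rng = np.random.default_rng(0)
    dists = np.linalg.norm(poses - tag, axis=1) + rng.normal(0, 0.01, 4)
    print(locate_tag(poses, dists))  # approximately [1.0, 2.0, 0.5]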

After the headset was complete, the team created a relatively simple AR interface to display item databases and locations. Users who put on the headset choose an RFID-tagged item from a menu. That's when the headset's antenna kicks in, providing the data needed to overlay the locator sphere on the item's position.

Adib's team tested the headset in the lab and in a simulated warehouse environment. In the warehouse, the headset localized items to within 9.8 centimeters on average and verified that users had picked up the correct item with 96% accuracy.

As one can imagine, this isn't just useful for a lighthearted game of hide-and-seek. Warehouse, retail, and factory workers could use the technology to quickly and easily find necessary equipment rather than opening and digging through bin after bin. Emergency services could conceivably use it for search-and-rescue missions, too, but anything they'd hope to find under snow or rubble would have to carry an RFID tag.

from ExtremeTech https://ift.tt/ZXAEfFB

Monday, February 27, 2023

NEWS TECHNOLOGIE

(Credit: Eric Zeman)
OnePlus has a new concept smartphone on display at Mobile World Congress, and it’s got some cool glowing lines on the back. OnePlus says those luminous lines aren’t just to look cool — they’re part of the phone’s “Active CryoFlux” cooling system. It stands to reason you could make an active liquid cooling system for a phone without having it glow, but where’s the fun in that?

The OnePlus 11 Concept uses the same basic body style as the retail phone, but there’s a glowing ring around the enormous camera module, along with the aforementioned glowing lines across the back. It’s designed to be eye-catching, but the underlying technology is, according to OnePlus, capable of improving the smartphone experience.

Some high-performance smartphones have shipped with active cooling in the form of a spinning fan, but that adds noise and weight and makes water resistance nigh impossible. The OnePlus 11 Concept instead uses an "industrial-grade piezoelectric ceramic micropump" to push coolant through a closed loop that pulls heat away from the Snapdragon 8 Gen 2 system-on-a-chip. The pump reportedly takes up less than 0.2 square centimeters of space, allowing the system to fit inside the OnePlus 11 chassis. However, it's unclear whether OnePlus had to change the internals or remove components to make room.

Improved cooling isn't just a gimmick: Modern smartphone SoCs get toasty under load, which causes them to throttle and lose performance. OnePlus says Active CryoFlux cooling can lower temperatures by 2.1 degrees Celsius (a difference of 3.78 degrees Fahrenheit) during games, which could let the SoC stay clocked higher and render three to four more frames per second. When charging, the phone could run 1.6 degrees Celsius (2.88 degrees Fahrenheit) cooler, which OnePlus says could cut 30 to 45 seconds from overall charging time. That last claim doesn't actually sound very impressive.

This device is only a concept, and OnePlus is fond of creating demo-worthy hardware, but most of it never finds its way into retail devices. Even if the company uses the Active CryoFlux system in future phones, it probably won’t glow like this. And anyway, the stock OnePlus 11 already has fantastic thermals. You can play games for as long as you want, and the “Cryo-velocity” cooling block inside ensures virtually no loss of performance. That’s not the case for a lot of other phones.

from ExtremeTech https://ift.tt/gaI9KJz

NEWS TECHNOLOGIE

The day has finally arrived when we can see official, non-AMD benchmarks of the company's new Zen 4 flagship CPU. The Ryzen 9 7950X3D goes on sale on Feb. 28 and is AMD's first 32-thread V-Cache CPU. That's exciting news for hardcore gamers with big budgets who have been waiting to upgrade. However, it's not a clean sweep over its rivals from Intel, or over competing chips from AMD, for that matter. Its $699 price tag is also tough to swallow given its niche position in the market: It's not the fastest chip in productivity, and while it can top the charts in gaming, that depends on the game. This puts it in a precarious position that might have AMD fans waiting for the less expensive 7800X3D to arrive in April.

As a quick recap, AMD is launching three Zen 4 V-Cache CPUs. The 16-core, 32-thread Ryzen 9 7950X3D, reviewed here, costs $699. There's also a 12-core, 24-thread Ryzen 9 7900X3D that costs $599; it looks like AMD didn't send that one out to reviewers. Finally, the eight-core, 16-thread Ryzen 7 7800X3D costs $449. That chip is coming in April, and it seems like it'll offer a tempting combination of price and performance; otherwise, AMD wouldn't be holding it back for so long. The chips differ in the amount of overall cache they offer: 144MB, 140MB, and 104MB, respectively. All three are 120W TDP parts, lower than their non-3D counterparts, as AMD had to drop clocks a bit to compensate for the extra heat generated by the additional cache. They also have a lower maximum temperature of 89C, compared with 95C for the previous CPUs.

Productivity Tests

Our colleagues at PCMag ran the numbers on this spicy bit of silicon. AMD marketed the earlier Ryzen 7 5800X3D purely as a gaming CPU, and that wasn't just marketing: The additional cache can offer a significant boost in certain types of games. With the Zen 4 Ryzen 9 chips, AMD is taking a different approach by using two chiplets: One carries the V-Cache, and the other doesn't. Theoretically, this allows a best-of-both-worlds scenario, with apps that don't benefit from the extra cache, like productivity software, running on the standard chiplet at high clocks while games run on the V-Cache die. However, as these tests show, the reduced clocks and lower temperature ceiling slow this CPU down in standard benchmarks. The non-V-Cache 7950X beats it in every test except one, showing the extra cache isn't helpful in these productivity workloads.

Gaming Tests

Here's where the rubber meets the road, and in the games PCMag used, it's not a decisive victory. The 7950X3D sits at the top of the benchmark charts, but at $699, it needs bigger margins than it's showing. The Core i9-13900K costs around $580, and the Ryzen 9 7950X is also around that price. Despite costing more than $100 extra, the 7950X3D either loses to both of those CPUs or finishes very close to them, which makes it a tough sell. Overall, there's no knockout victory here of the kind the Ryzen 7 5800X3D delivered as the undisputed gaming champ of its generation.

Power and Thermals

This is one area where AMD's newest CPU shines. It shows the benefit of the extra cache in that it can go toe-to-toe with Intel in benchmarks while using much less power. When running Adobe Premiere and Cinebench, total system power is a shocking 130W less than with the Core i9-13900K, and 160W less than with the flagship, notoriously power-hungry Core i9-13900KS. For thermals, it sits at a toasty 90C despite being billed as an 89C max-temp CPU. That's hot, but still 10C less than its Intel rivals. Overall, AMD's decision to lower the TDP seems to have paid off.

Conclusion

The only portion of PCMag's tests we didn't analyze, iGPU performance, is a big win for AMD, but it won't matter to someone buying a $699 CPU, so we're leaving it out. Overall, though, this is not the triumph people expected when AMD announced these CPUs. It didn't mop the floor with Intel's CPUs, or with the non-V-Cache versions either. The lack of numbers for the 12-core and eight-core versions might also give people pause. AMD certainly has a fast and efficient CPU on its hands, though. The price guarantees that only the most hardcore buyers will spring for it, especially given the cost of AM5 motherboards and DDR5 memory.

from ExtremeTech https://ift.tt/R1b2UCP

NEWS TECHNOLOGIE

SpaceX surprised everyone with the rapid spread of Starlink internet access, but the company may have overextended itself. Speeds have been on the decline while prices keep going up, and the solution is more satellites. SpaceX has been making plans for its Gen 2 Starlink megaconstellation, and it’s showing off the “V2 Mini” satellites that will help it get there without waiting for the delayed Starship rocket.

SpaceX plans to launch the first V2 Mini satellites later today (Feb. 27) aboard a good old-fashioned Falcon 9 rocket. This vehicle is still SpaceX's workhorse despite years of work on the upcoming Starship. SpaceX CEO Elon Musk previously said that Starship would reach orbit in 2022, but the company is still performing engine tests on Starship and its Super Heavy first-stage booster. SpaceX is targeting the next few weeks for Starship's first orbital test.

The V2 Mini satellites appear to be a halfway point between the old Starlink nodes and the upcoming V2 satellites. They will physically fit inside the Falcon 9's payload fairing, but the rocket will only be able to send 21 of them into space at a time. That's a far cry from the 60-satellite payload of Starlink V1 missions. According to Teslarati, that means each V2 Mini satellite has a mass of about 1,830 pounds (830 kilograms); 21 satellites at that weight comes to roughly 17.4 metric tons, close to the practical payload limit of a reusable Falcon 9.

The true Starlink V2 satellites are much larger than the current design, weighing up to two tons (4,400 pounds) each. These satellites will offer 10 times the bandwidth of the V1 satellites, and the FCC has granted SpaceX a license to deploy as many as 7,500 of them in new orbits. SpaceX previously added a few V1 satellites in these orbits, probably to conduct testing for the Gen 2 deployment. Clearly, SpaceX has decided that it needs to expand capacity now rather than waiting on Starship to launch V2 satellites.

The V2 Minis won't have the capabilities of the full-scale version, but they make several important improvements over the V1 type. They have a larger, more powerful phased-array antenna, along with two 52.5-square-meter solar panels. They will use the E-band for backhaul, allowing for up to four times higher network capacity, which should make up for the smaller number of satellites per launch. The V2 Mini also has a new argon-fueled Hall thruster, replacing the more expensive krypton-fueled engine from earlier Starlink hardware.

SpaceX's live coverage of the first V2 Mini launch starts at 5 p.m. ET, with liftoff slated for 6:13 p.m.

from ExtremeTech https://ift.tt/IrMCoAL

NEWS TECHNOLOGIE

(Image: Christian Wiediger/Unsplash)
If you pay for YouTube Premium, you may have recently come across a new playback option in the bottom-right corner of your screen. YouTube is sneakily testing “1080p Premium,” a video quality option that streams at an enhanced bitrate.

YouTube videos play at 1080p (1,920 pixels horizontally, 1,080 vertically) by default. Free and paid users can adjust playback down to 144p, 240p, 360p, 480p, or 720p60 (720p at 60 frames per second), depending on the video they're watching. YouTube has also added higher-definition options over the last decade: 1080p HD, 1440p HD, and 4K. Together, these choices let users tailor their viewing experience to the device they're watching on, their internet speed, their comfort, and more.

Over the last few days, Premium users have reported seeing a new 1080p Premium option in the video quality menu. YouTube spokesperson Paul Pennington told The Verge that “a small group of YouTube Premium subscribers” can try out the feature, which purports to offer “an enhanced bitrate version of 1080p which provides more information per pixel.” This, Pennington said, creates a higher-quality viewing experience.

Bitrate, the measure of how much video data is transferred within a given timeframe, is often said to be half of what makes for a visually appealing playback experience. YouTube's standard 1080p option typically plays at eight to 10 megabits per second (Mbps). While YouTube hasn't said what bitrate 1080p Premium provides, a Reddit user who consulted YouTube's "Stats for Nerds" tool found the playback mode running at about 13Mbps, a marked improvement over the standard option given the difference an extra 3Mbps or more can make.
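
To put those numbers in perspective, here's a quick back-of-the-envelope Python calculation (ours, not YouTube's) of how much data an hour of video moves at each bitrate, ignoring audio and protocol overhead:

    def gigabytes_per_hour(mbps: float) -> float:
        # 1 megabit = 1e6 bits, 8 bits per byte, 3,600 seconds per hour.
        return mbps * 1e6 / 8 * 3600 / 1e9

    for label, mbps in [("standard 1080p (low)", 8),
                        ("standard 1080p (high)", 10),
                        ("1080p Premium (reported)", 13)]:
        print(f"{label}: ~{gigabytes_per_hour(mbps):.1f} GB/hour")
    # Prints roughly 3.6, 4.5, and 5.9 GB/hour, respectively.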

With the news of 1080p Premium, some non-Premium users have expressed concern that YouTube is ditching the standard 1080p option to tempt free users toward its paid subscription. YouTube insists this isn’t the case. According to Pennington, “there are no changes to the existing quality offerings for 1080p (HD) resolution on YouTube.” It’s also important to remember that 1080p Premium is in testing; there’s no guarantee YouTube will roll it out to all Premium subscribers.

from ExtremeTech https://ift.tt/nU5XxwS

NEWS TECHNOLOGIE

Microsoft is reportedly ramping up its efforts to annoy people running Windows 11 on unsupported hardware. The company is adding a watermark to users' desktops admonishing them for not meeting Windows 11's requirements. Those requirements have long been a huge roadblock for Windows 11 adoption, as the OS is only supported on newer PCs. The biggest sticking point is the requirement for a Trusted Platform Module (TPM) 2.0, which only comes with CPUs made in the past few years. It also has to be enabled in the BIOS, so a newer motherboard is needed too. However, there have always been workarounds for running Windows 11 on older PCs, and now Microsoft is upping the stakes by bugging users of those machines with desktop messages. You can turn the message off if you're comfortable editing the Windows Registry.

Microsoft apparently began pushing this message recently to users who deployed the TPM bypass or another workaround. A user on Twitter flagged it as appearing unexpectedly; that user was indeed using the TPM bypass, as their laptop is 15 years old. The blunt message states, "System requirements not met. Go to Settings to learn more." To be fair to Microsoft, a user whose system doesn't support Windows 11 might want to learn more. However, according to HotHardware, there's not much to be gleaned from doing so in Settings. Instead, it says you "might want to consider purchasing a new PC." Gee, thanks for the advice, Microsoft.

If you're seeing this message and have decided not to buy a new PC, here's how to get rid of it. We have to insert the usual disclaimer here about the dangers of editing the registry: This is the underbelly of the entire operating system, so proceed with caution. It's also a good idea to create a System Restore Point before you start, which will save a copy of the current registry in case things go sideways. With that in mind, here's what you need to do (a scripted version of the same change follows the list):

  • Click on the Start menu and type Registry Editor and open it
  • Expand HKEY_CURRENT_USER
  • Navigate to and expand Control Panel
  • Click the UnsupportedHardwareNotificationCache folder
  • In the accompanying window pane, right-click the SV2 entry and select Modify
  • Change the value to “0” then click OK
  • Reboot your PC
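
If you'd rather script the change, the same tweak can be made with Python's standard winreg module on Windows. This is a minimal sketch of the exact steps above; it assumes the UnsupportedHardwareNotificationCache key already exists, which it should if you're seeing the watermark:

    import winreg

    # The key from the steps above; it lives under HKEY_CURRENT_USER,
    # so no administrator rights are needed.
    KEY_PATH = r"Control Panel\UnsupportedHardwareNotificationCache"

    # Set the SV2 value to 0 (a REG_DWORD) to suppress the
    # "System requirements not met" watermark, then reboot.
    with winreg.OpenKey(winreg.HKEY_CURRENT_USER, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "SV2", 0, winreg.REG_DWORD, 0)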

To recap, the official requirements for Windows 11 are as follows: You’ll need an 8th Gen Intel CPU or second-gen AMD Ryzen CPU, at least 4GB of memory, 64GB of storage, and a motherboard that supports UEFI and Secure Boot. Your graphics card has to support DirectX 12. Finally, you’ll need a 9-inch display that supports 720p resolution.

from ExtremeTech https://ift.tt/a9AMJCT

Saturday, February 25, 2023

NEWS TECHNOLOGIE

Good afternoon, readers, and welcome to This Week in Space: your weekly roundup of news from here to the big empty. This week, a lot is going on with SpaceX and the International Space Station. Today we'll hear those updates, plus a colossal solar flare, 'beneficiated regolith,' and a black hole that decided three's a crowd.

For our fellow sci-fi nerds: This week also marks the 30th anniversary of the beloved sci-fi TV series Babylon 5. The show’s pilot aired on Feb. 23, 1993. Unfortunately, the recent reboot attempt (with J. Michael Straczynski at the helm!) has run aground upon the shoals of studio funding. Even so, we’ll be busting out the DVD box set this weekend.

Russia Launches Replacement Soyuz Capsule to International Space Station

Last December, a micrometeoroid or orbital debris (MMOD) punched a hole in Russia’s Soyuz MS-22 capsule, leaving two cosmonauts and an astronaut without an easy way home. A replacement Soyuz capsule, MS-23, took off for the International Space Station last night (Thursday, Feb. 23) at 7:24 p.m. EST from the Baikonur Cosmodrome in Kazakhstan. According to a NASA blog update, the uncrewed Soyuz spacecraft is safely in orbit headed for the ISS.

After a two-day journey, MS-23 will dock with the station’s Poisk module at 8:01 p.m. tomorrow (Saturday, Feb. 25). This new Soyuz will replace the Soyuz MS-22 spacecraft. NASA astronaut Frank Rubio and Roscosmos cosmonauts Sergey Prokopyev and Dmitri Petelin would have returned on the Soyuz MS-22, but that MMOD strike wrecked the capsule’s coolant loop. The three crew members will return to Earth on the new Soyuz MS-23 later this year.

The damaged Soyuz MS-22 is scheduled to undock from the station in late March. When it does, it will parachute back to Earth, landing somewhere in Kazakhstan for post-flight analysis by Roscosmos.

…the MS-23 capsule will carry home the astronaut and cosmonauts currently semi-stranded in orbit.

Russia moved up the launch of the Soyuz MS-23 spacecraft to Thursday from a target date that had been drifting somewhere in March. MS-23 was supposed to be just another routine crew rotation for personnel on the space station. However, the December coolant leak, along with a second coolant leak from the Progress-82 cargo capsule docked at the station, set in motion a series of changes to the plan.

MS-23 was supposed to launch later this spring with three crew members on board. However, the MS-22 leak wrecked that schedule and left Prokopyev, Petelin, and Rubio looking for a different ride. Roscosmos determined that the damaged Soyuz could perhaps carry two people safely home in an emergency that demanded swift evacuation. In that scenario, Prokopyev and Petelin would entrust their lives to the MS-22 capsule, while Rubio would take shelter aboard the four-person SpaceX Crew-5 capsule.

…meanwhile, SpaceX delayed its Crew-6 launch until Monday.

The SpaceX Falcon 9 rocket that will carry Crew-6 to the International Space Station is upright on its launch pad at NASA’s Kennedy Space Center in preparation for its scheduled Monday launch.

(Credit: NASA/Joel Kowsky, via NASA HQ Flickr)

“An integrated static fire test and dry dress rehearsal with the crew will occur prior to liftoff,” NASA officials wrote in a blog post. If those tests go well, the rocket will take off Monday morning (Feb. 27) at 1:45 a.m. EST (0645 GMT) from Launch Complex 39A at NASA’s Kennedy Space Center.

Crew-6 is a four-person team: Sultan Al-Neyadi, the first United Arab Emirates astronaut to perform a long-duration mission; NASA astronauts Warren "Woody" Hoburg and Stephen Bowen; and cosmonaut Andrey Fedyaev of the Russian space agency Roscosmos.

According to NASA, this mission will be the fourth spaceflight for Bowen, who flew several space shuttle missions between 2008 and 2011. For Hoburg, Al-Neyadi, and Fedyaev, Crew-6 will be their first spaceflight. The quartet will spend up to six months in microgravity before returning to Earth.

…SpaceX also postponed its Starlink launch, giving priority to the Crew-6 mission.

SpaceX confirmed Thursday that the next batch of Starlink internet satellites will now launch no earlier than Sunday.

The mission, Starlink 6-1, will put another group of SpaceX internet satellites into low-Earth orbit. (Trivia: Their 43-degree orbital inclination with respect to the equator will leave the satellites' ground track tracing a nearly perfect sine wave.) It was originally scheduled to launch yesterday aboard a Falcon 9 rocket from Space Launch Complex 40 at Cape Canaveral. However, SpaceX and NASA officials said Wednesday that the Starlink 6-1 mission would slip from Thursday to "no earlier than Sunday." Meanwhile, mission personnel are getting the Crew-6 Falcon 9 rocket ready for takeoff at the Kennedy Space Center.

Calling Ham Radio Operators: NASA Wants Your Help

You all know by now how much we love citizen science. An 'amateur' astronomer was instrumental in finding a recent impact that taught us much about what lies beneath Jupiter's mysterious cloud tops. NASA also recognizes the great value of citizen scientists, and in its most recent outreach, the agency has called on ham radio operators to help study the upcoming solar eclipses in 2023 and 2024. From the HamSCI project's site:

[Long-distance ham radio] communication is possible due to interactions between our Sun and the ionosphere, the ionized region of the Earth’s atmosphere located roughly 80 to 1000 km overhead. The upcoming eclipses (Oct. 14, 2023, and April 8, 2024) provide unique opportunities to study these interactions. As you and other HamSCI members transmit, receive, and record signals across the radio spectrum during the eclipse, you will create valuable data to test computer models of the ionosphere.

NASA plans to include measurements of the ionosphere and signal-spotting challenges in the eclipse events. Since the nearest eclipse is in October, you’ll have time to get your setup in order. This might also be a great way to bring hands-on science into the classroom once the school year starts. For more information, check out the solar eclipse page on HamSCI.org.

European Space Agency Launches Lunar Farming Study

If humans want to establish a long-term presence on the Moon, we’ll need to figure out how to get food once we’re there. In pursuit of that idea, the European Space Agency (ESA) announced this week that it has begun a year-long study on what it will take to create flourishing farms on the Moon.

My colleague Adrianna Nine writes that the project, “Enabling Lunar In-Situ Agriculture by Producing Fertilizer from Beneficiated Regolith,” will study various ways of extracting minerals from lunar soil for hydroponic farming. Lunar regolith has lots of nutrients, it turns out. Alas, it doesn’t have one crucial component Earthly plants all seem to need — nitrogen.

NASA Working to Develop a Battery That Could Survive on Venus

Venus is a deeply hostile place. Landers we try to drop onto its surface melt into slag, sometimes within minutes. Even so, NASA is working with a private company to develop a power system that can survive on the planet's oppressive surface for an incredible sixty days. Not just any power system will do, either. My colleague Ryan Whitwam points out, "it has to be a battery. You can't use solar panels due to the planet's thick atmosphere. A radioisotope thermoelectric generator (RTG) like the one used in the Perseverance rover would produce heat, and Venus is already hot enough to threaten the 22-pound lander."

Test battery hardware: High-temperature thermal batteries adapted for the Venus surface. (Credit: Dr. Michael Barclay, Advanced Thermal Batteries)

The heat-loving battery is based on the short-lived thermal batteries used in smart missiles, and this technology could be ideal for use on Venus. The 17-cell battery developed by ATB uses a special high-temperature electrolyte that is solid and inert at normal Earth temperatures. When heated to high temperatures, the battery instantly provides high power output, and there’s plenty of heat to spare on Venus.

The Worst Pile-Up Ever: Black Holes Discovered On Collision Course

Make a list of the most destructive phenomena in the universe, and black holes are likely to make an appearance near the top. We’ve learned a great deal about them in recent years, but they still occasionally delight in telling our understanding of physics to go lay an egg. Case in point: According to data from the Chandra Observatory, two black holes in distant dwarf galaxies are on a collision course with each other.

(Credit: X-ray: NASA/CXC/Univ. of Alabama/M. Micic et al.; Optical: International Gemini Observatory/NOIRLab/NSF/AURA)

Many, if not most, galaxies have a supermassive black hole at their core. Until now, however, black holes hadn't been spotted at the centers of the dwarf galaxies we can see. Researchers have found two sets of black holes in dwarf galaxies on collision courses, implying the scenario may be more common than observations to date suggested.

“Astronomers have found many examples of black holes on collision courses in large galaxies that are relatively close by,” said Marko Micic of the University of Alabama at Tuscaloosa, who led the study. “But searches for them in dwarf galaxies are much more challenging and until now had failed.”

The researchers overcame the intrinsic difficulty of imaging these far-flung galaxies by combining optical observations from the Canada-France-Hawaii Telescope with data from the Chandra X-ray Observatory and NASA's Wide-field Infrared Survey Explorer. The two pairs of colliding black holes are located 760 million and 3.2 billion light-years from Earth, respectively, and both are in the process of merging. Astronomers believe that larger galaxies like the Milky Way formed through the collision of dwarf galaxies, which means we may be watching the same type of merger that gave rise to the Milky Way and the glowing band it traces across our sky: the backbone of night.

Webb Telescope Spies a Legion of Tiny Stars Hubble Couldn’t See

It's easy to crow about the outstanding clarity of the James Webb Space Telescope's vision. I myself have declared the JWST the victor over the Spitzer and Hubble space telescopes after comparing what they see when they look at the same part of the sky. But there's a use to the comparison beyond gloating. In a recent NASA interview, astronomers tell how they used just such a comparison to find a huge population of cool, dim, low-mass stars with the JWST that Hubble couldn't detect.

The astronomers explained that Webb is well suited to finding very cool, low-mass stars of less than 0.1 solar masses. That's very close to the mass threshold above which a celestial body ignites fusion and begins to radiate light as a star. With Hubble, these stars' faint points of light were lost in the noise. Yet small stars like these are apparently the most numerous in the universe, so what we learn about them with the JWST has implications for our understanding of cosmic history.

Scientists Find Gargantuan “Runaway” Supermassive Black Hole

Looking elsewhere in the sky with Hubble, astronomers have spotted a “runaway” supermassive black hole with its own stellar entourage. It appears that the black hole was ejected from its home galaxy. It’s now racing away at ludicrous speed, with dozens of stars trailing in its wake as if it were some great celestial Pied Piper.

Artist’s impression of the runaway black hole as it streaks away from its home galaxy. (Credit: Keio University, via LiveScience)

In a paper recently accepted for publication in The Astrophysical Journal Letters, astronomers present the observation as the first direct evidence that supermassive black holes can be flung from their home galaxies by some titanic force and cast into interstellar space.

The researchers discovered the runaway black hole by the light of its “entourage.” Black holes are invisible against the pure black of the deep sky. However, the researchers spotted a brilliant and unexpected streak of light while using the Hubble Space Telescope to observe the dwarf galaxy RCP 28, which lies about 7.5 billion light-years from Earth.

The next step is to figure out how this could possibly have happened. Pieter van Dokkum, the lead author of the paper, believes that a “gravitational slingshot” threw the black hole out into deep space. “The most likely scenario that explains everything we’ve seen is a slingshot, caused by a three-body interaction,” van Dokkum said. “When three similar-mass bodies gravitationally interact, the interaction does not lead to a stable configuration but usually to the formation of a binary and the ejection of the third body.”

The report is currently available on arXiv ahead of its formal publication.

Skywatchers Corner

Normally we don't report on solar flares unless they're big enough to cause significant problems here on Earth. That was the case with a powerful X-class solar flare that released a shockwave on the surface of the Sun last weekend. The shockwave created a "solar tsunami" that experts believe approached sixty thousand miles in height. Fortunately, the event occurred at a spot on the Sun that kept the outbound coronal mass ejection (CME) from being aimed squarely at Earth.

NASA’s Solar Dynamics Observatory caught the flare as it happened. (Credit: NASA Solar Dynamics Observatory, via SpaceWeather.com)

The CME “only” scraped past the Earth (instead of slamming into us head-on). However, it was still enough to create a geomagnetic storm that caused serious short-wave radio blackouts Wednesday across a huge swathe of the planet, including the southwestern US. The ion storm also lit up the night with a display of the aurora borealis visible as far south as Michigan and New York.

Amateur radio astronomer Thomas Ashcraft had his telescope pointed right at the Sun when the flare let loose, and he caught a recording of the roar of radio noise when the burst hit our atmosphere. "The sun was right in my radio telescope beam when the flare occurred," said Ashcraft, "and my spectrograph captured the full force of the resulting radio burst."

Elsewhere in the sky — Monday night (Feb. 27), look to the southwest shortly after sunset. The Moon and Mars are in conjunction, and they will appear less than a degree apart.

Feature image credit: NASA/Joel Kowsky, via NASA HQ Flickr

from ExtremeTech https://ift.tt/K8LZcqy

Friday, February 24, 2023

NEWS TECHNOLOGIE

A team of astronomers has used some of the most powerful telescopes on (and orbiting) Earth to make a first-of-its-kind observation. They’ve spotted a pair of dwarf galaxies containing supermassive black holes on a collision course. Then, they did it again. Yes, two pairs of colliding dwarf galaxies, both extremely distant but at different phases of merging. Scientists hope this discovery can help shed light on how large galaxies like the Milky Way came to be.

A dwarf galaxy is one with less than 3 billion solar masses, about one-twentieth the mass of the Milky Way. There are plenty of those drifting around in our corner of the universe, including several orbiting the Milky Way, though none of them are about to collide. The current scientific consensus holds that the large galaxies common today arose from the merger of smaller ones, but ancient galactic mergers are difficult to observe because of their distance. The team, led by astrophysicist Marko Micic of the University of Alabama, got around that by combining data from the Chandra X-ray Observatory, infrared observations from NASA's Wide-field Infrared Survey Explorer (WISE), and optical data from the Canada-France-Hawaii Telescope (CFHT).

The orbiting Chandra observatory was particularly valuable in this study because it revealed the black holes, which become more active as the galaxies move toward each other. The disk of superheated material spiraling toward each event horizon reaches millions of degrees, causing it to emit X-rays. This data pinpointed the colliding dwarf galaxies, while the visible and infrared data showed how the galaxies are interacting as they approach each other.

In the image above, the left composite shows a merger in the galaxy cluster Abell 133, about 760 million light-years from Earth. Since those galaxies are in the late stages of merging, the pair has been given a single name: Mirabilis, after an endangered species of hummingbird. On the right are the dwarf galaxies Elstir and Vinteuil, named after characters from Proust's "In Search of Lost Time." These galaxies are still separate, but a bridge of stars and gas has started to form between them. They are located in the Abell 1758S cluster, some 3.2 billion light-years away.

With these objects identified, astronomers will be able to conduct follow-up observations that reveal the processes taking place as they merge. In time, both pairs of galaxies will become larger dwarf galaxies with even bigger central black holes. That could draw more small galaxies toward them, resulting in further mergers. Given the time scales involved, we can't just stare at Elstir and Vinteuil until that happens, but astronomers have other sources of data. The James Webb Space Telescope recently revealed the Sparkler Galaxy, which appears to be a mirror image of a young Milky Way, studded with glowing globular clusters. This, too, could offer insight into how our galaxy evolved over the eons.

from ExtremeTech https://ift.tt/IcFJz9y

NEWS TECHNOLOGIE

Intel CEO Pat Gelsinger holds an 18A SRAM wafer. (Credit: Intel)

The last few weeks have been a rocky ride for Intel and its CEO, Pat Gelsinger. The company posted brutal year-end earnings that shocked the semiconductor industry, which resulted in Intel cutting executive salaries and reducing incentives such as 401(k) matching for all employees; Gelsinger himself took a 25% pay cut as well. On top of that avalanche of bad news came an even worse prediction from sources in Taiwan: A report from DigiTimes said Intel was delaying its TSMC order of 3nm tiles destined for its 15th-generation Arrow Lake architecture. The delay would bump Arrow Lake's arrival into 2025 and mark a major stumble for Intel. Now its CEO has rebutted those claims in no uncertain terms, saying they're "patently false."

Intel's CEO made the comments in a Capital Allocation Update conference call with analysts this week. Tom's Hardware captured the highlights, including Intel saying it is slashing its quarterly dividend by two-thirds, which will allow the company to save up to $3 billion in 2023. On the call, Gelsinger also addressed the recent report about delays for Arrow Lake, the architecture that will follow 2023's Meteor Lake. Arrow Lake is scheduled for 2024; the DigiTimes report said it was being pushed to 2025.

Intel will transition from monolithic to tile-based designs starting in 2023 with Meteor Lake. (Image: Intel)

“I am somewhat amazed by some of these rumor mill discussions that come out,” said Gelsinger. “You might notice there were similar ones on Intel 4 a few months ago and also with some of our other TSMC programs, which were patently false at the time as well.”

Having written a dozen or so articles about rumored delays at Intel, your author can't recall Gelsinger ever commenting this specifically on one before. It's possible this particular rumor was too egregious for him to stay quiet. It's also possible he felt the need to get out in front of this one because of the recent maelstrom.

"The 3nm programs are on track, both with TSMC as well as our internal Intel 3 programs, Granite Rapids and Sierra Forest in particular," said Gelsinger. Intel 3 is a refined version of the Intel 4 process, the node formerly known as Intel's 7nm. This is also the first time we can recall Gelsinger commenting on anything related to the company's relationship with TSMC, a partnership that has itself been the subject of delay rumors.

For example, it was previously speculated that Intel was dropping the TSMC 3nm iGPU tile from Meteor Lake, with the assumption that the company would go with a 5nm part instead. With the news this week that Apple has purchased all of TSMC's initial 3nm wafers, it'll be interesting to see what ends up in Meteor Lake. It's possible Intel could use the refined N3E version of TSMC's 3nm node for Meteor Lake in late 2023. On the same call, Gelsinger affirmed that Meteor Lake is still on schedule.

Gelsinger concluded by saying that Intel’s execution will put all the rumors to rest. “You know, these rumors, like many others, will be proven by our execution to be firmly false,” he said. As always, we will have to wait and see if that is true. Something tells your humble author we will be reexamining these comments in the not-too-distant future.

from ExtremeTech https://ift.tt/j3apy98

NEWS TECHNOLOGIE

We’ve heard about quantum computers for years, but no one has made one better at crunching numbers than a binary machine. Google’s Quantum Engineering team may be on the right track, though. For the first time, the team built a larger quantum computer that didn’t become less accurate. In a few years, we might consider this a significant turning point in the quest to make quantum computers useful.

The promise of a practical quantum computer is alluring: A unit of quantum information (a qubit) can encode more than just the 0 or 1 of a traditional binary bit, which means a quantum computer can, in theory, be much more powerful. However, qubits are sensitive to interference, even from light and temperature variations. This leads to high error rates that make the output of quantum computers untrustworthy, and ordinarily, the more qubits you add, the higher the error rate.

However, Google's quantum engineering team says there may be light at the end of the tunnel. In its latest experiment, the team used quantum error correction to reduce the error rate while also making the quantum array larger. Google's engineers group physical qubits into arrays of 49 to form a single logical qubit. In the past, Google worked with groups of 17 qubits, but the new 49-qubit design demonstrated a lower error rate.
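
The idea is easier to see with a toy model. The Python sketch below simulates a plain repetition code with majority-vote decoding, far simpler than the error-correcting code Google actually uses, but it shows the property that matters: When the physical error rate is below the code's threshold, adding more physical qubits per logical qubit drives the logical error rate down rather than up.

    from math import comb

    def logical_error_rate(n: int, p: float) -> float:
        # Probability that a majority of n independent physical qubits
        # flip, i.e. that majority-vote decoding fails.
        return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range(n // 2 + 1, n + 1))

    p = 0.01  # physical error rate, well below the 50% threshold
    for n in (1, 17, 49):  # group sizes loosely echoing Google's 17 and 49
        print(f"{n:2d} qubits -> logical error {logical_error_rate(n, p):.2e}")
    # The logical error rate shrinks as n grows, the opposite of what
    # happens to raw qubit arrays without error correction.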

According to Google, this is the first time anyone has scaled a logical qubit without increasing the error rate. This could be an important milestone on the way to a practical quantum computer. Google cites potential use cases like modeling new molecules for medical uses, refining battery technology, and designing power-generating fusion reactors.

Creating a larger logical qubit with a lower error rate is a big step in that direction, but the hardware and software that goes into quantum computing must first improve. Google is looking toward upgrades in control electronics and cryogenics to move us in the right direction, and the materials that go into the company’s Sycamore 2 quantum chips will be refined. That could get us to a place where quantum computing has real-world uses, and Google says it’s already planning for that day. The company is working with government agencies and the larger security community to ensure that internet traffic and Google’s cloud services remain secure in a world of robust, scalable quantum computers.

from ExtremeTech https://ift.tt/6YCsKVi

NEWS TECHNOLOGIE

(Image: Gilles Lambert/Unsplash)
Banks are continuously coming up with new ways to verify customers’ identities. Before biometrics allowed people to access their accounts from home, banks required customers to show their driver’s licenses or sign a paper or digital pad to compare the signature with the one on file. Now most of us can access our balances, make transfers, and complete other virtual banking tasks with a quick Face ID scan, a fingerprint scan, or by speaking a specific phrase out loud.

One little problem: The latter option is easy to exploit. Vice Motherboard’s Joseph Cox successfully “broke into” his own bank account this week using a readily available AI voice generator, which mimicked Cox’s voice well enough that the bank’s voice-based biometric security system didn’t raise a red flag.

Cox started with a free voice-creation service from Eleven Labs, an AI company founded by a former Googler that focuses on "realistic and versatile" speech. He recorded five minutes of his own speech and uploaded it to Eleven Labs' software, which used the recordings to create a synthetic voice. Cox could then type in whatever he wanted the synthetic voice to say and download the snippets as audio files. Among the phrases he generated were "Check my balance" and "My voice is my password."

Cox called his bank—Lloyds Bank in the United Kingdom—and played the files corresponding with these phrases as the automated teller walked him through its security checks. To Cox’s disbelief, the system perceived the synthetic voice as his own and allowed him access to his balance, recent transfers, and other information.

Lloyds Bank is far from the only bank to use voice verification. (Image: Nick Sarvari/Unsplash)

“Some banks tout voice identification as equivalent to a fingerprint, a secure and convenient way for users to interact with their bank,” Cox writes in his description of the hack. “But this experiment shatters the idea that voice-based biometric security provides foolproof protection in a world where anyone can now generate synthetic voices for cheap or sometimes at no cost.”

A glaring vulnerability like this doesn’t just spell danger for unsuspecting randoms; it also presents real risks for victims of financial abuse, who’d otherwise keep balances and transactions private from toxic partners. Cox notes that while bad actors would need a victim’s date of birth to break into an account (he had to type his own into his phone), such information is readily available online, thanks to social media, data breaches, and data brokers.

Between Wells Fargo’s “Voice Verification” system and Chase’s “Voice ID,” large banks tend to assure their customers that voice-based security systems are secure and effective. When Cox informed his bank that he’d been able to access his account with a synthetic voice, a spokesperson responded that Lloyds Bank’s system provided “higher levels of security than traditional knowledge-based authentication methods.” Wells Fargo and Chase didn’t respond to related inquiries.

For now, there's one small comfort: Voice verification is usually an option that can be disabled, much like Face ID. With the recent proliferation of AI-based imitation technology, it might be best to turn that feature off.

from ExtremeTech https://ift.tt/SymBTCr

NEWS TECHNOLOGIE

Half-Life was Valve’s first major game release, and it catapulted the company to the top of the gaming industry. Today, Valve runs the Steam gaming platform where you can purchase all its titles, as well as games from almost every other publisher. And you can still play the 1998 version of Half-Life if you so choose. This game just got a makeover that could make that proposition more appealing, but it’s not Valve’s doing. Modder Sultim Tsyrendashiev has completed work on a ray tracing plug-in for Half-Life, bringing the best in modern lighting to the 25-year-old game.

Ray tracing started appearing in games several years ago. The technology allows graphics hardware to simulate the physical properties of light to generate realistic illumination, shadows, and reflections. This is a computationally expensive process, so video cards were not powerful enough to do it until the last several generations, and AMD is still way behind the curve. Meanwhile, Sony and Microsoft added ray tracing support to their current-gen consoles.

Half-Life 1 predates ray tracing by a few decades, but Tsyrendashiev has succeeded in updating the lighting, just as he did for Doom, Quake, and other classic games in previous projects. To peek at Half-Life 1 with ray tracing, you’ll need a copy of the original game, available on Steam for $10.

Tsyrendashiev provides instructions on GitHub to get the mod running, but it doesn’t look difficult. All you need to do is unzip the mod files (about 130MB total) and copy them into the Half-Life folder. The textures, character models, and animations might still look like something out of the late 90s, but the lighting is a night and day difference. You can even see how the new lighting changes the experience by toggling back and forth — just press x to switch.
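
If you'd like to script the copy step, here's a minimal Python sketch. The archive name and Steam path below are placeholders rather than the mod's real file names, so point them at your actual download and install locations:

    import zipfile
    from pathlib import Path

    # Placeholder paths: substitute your downloaded release archive
    # and your own Steam library location.
    MOD_ZIP = Path("hl1_ray_traced.zip")
    HALF_LIFE_DIR = Path(r"C:\Program Files (x86)\Steam\steamapps\common\Half-Life")

    # Per the GitHub instructions, the archive's contents go straight
    # into the game folder.
    with zipfile.ZipFile(MOD_ZIP) as archive:
        archive.extractall(HALF_LIFE_DIR)
    print(f"Extracted {MOD_ZIP.name} into {HALF_LIFE_DIR}")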

The mod developer recommends using an Nvidia RTX card because of AMD’s lagging ray tracing performance. Additionally, you may want to avoid custom maps as they could render with unplayably dim lighting. There are also some quirks due to how Half-Life uses a local server to send objects. The renderer also doesn’t support dynamic light maps, so there’s no ray-traced flashlight beam. Even without that, the game looks dramatically better with modern lighting.

from ExtremeTech https://ift.tt/pUeCXt3