Monday, October 31, 2022


(Photo: Shift Robotics)
What if you could walk really fast? Okay, not Sonic the Hedgehog fast…and not Quicksilver fast, either. Think more along the lines of nearly three times faster than you already cruise through the park or through a dying shopping mall.

Moonwalkers promise to accomplish exactly that. Designed by a team of Carnegie Mellon University engineers who banded together to found Shift Robotics, these battery-powered attachments strap onto almost any pair of shoes to give you an enviable speed boost. Instead of free-wheeling like roller skates, Moonwalkers’ eight polyurethane wheels work with a set of built-in sensors to switch between “lock” and “shift” modes; the lock mode keeps the wheels from spinning when the wearer is navigating stairs, using public transit, or otherwise needs full control of their motion. These modes also help the wearer stop within one meter even at top speed, which is said to be 250 percent faster than a normal walking pace.

The attachments’ chassis are made entirely from aluminum to resist crushing and assist in thermal management. The 300-watt electric motor, which powers the wheels for up to six miles of active use per charge, is fully sealed to protect against water and debris ingress. According to Shift, this is what allows Moonwalkers to navigate puddles and sidewalks that are in less-than-perfect condition. Because Moonwalkers are designed to match the wearer’s gait, there’s said to be zero learning curve, which can’t be said for conventional equipment like roller skates and rollerblades.

It’s hard to avoid wondering if Moonwalkers are a solution to a problem that doesn’t exist. After all, a full range of motion is still required to use them; they don’t serve as a mobility aid for those who can’t already walk, and since they use wheels, Moonwalkers are rendered pointless on most unpaved surfaces. (You certainly can’t bring them on a hike or to the beach, where walking is arguably more exhausting.) Shift Robotics seems to be positioning its attachments as a way to make city life a bit more efficient: “With Moonwalkers, you can pick up your dry cleaning across town, carry those grocery bags a little easier, grab those last-minute dinner items much quicker, or whatever else with much more ease,” its Kickstarter page reads.

But at $799 to $1,299 per pair (depending on the Kickstarter campaign’s progress), the cost of that added efficiency is pretty steep. This means Moonwalkers’ target audience is quite small: Frugal budgeters, rural dwellers, and those who like to stop and smell the roses need not back this project.


from ExtremeTech https://ift.tt/QqzTlet


It’s been a time of glorious bounty in the PC hardware universe unlike any other. We’ve seen powerful new CPU architectures from both Intel and AMD launched within weeks of each other. We also witnessed the unabashedly potent RTX 40-series launch. It’s been an exciting time to be a PC gamer.

Now there’s just one more highly anticipated arrival that will effectively wrap up the party: AMD’s next-gen GPUs. These will truly be something special: the first chiplet-based GPUs for consumers. The RX 7900 XT was an obvious product, but the existence of a second, more powerful GPU is the big news here. It seems AMD is following Nvidia’s strategy of launching two high-end cards first. As you may recall, Nvidia previously “unlaunched” its third 40-series GPU, the RTX 4080 12GB.

The latest rumors indicate AMD will launch the RX 7900 XTX and the RX 7900 XT this week. Yes, that is a lot of Xs. AMD is also surprisingly bringing back the XTX moniker, which it hasn’t used since the X1900 XTX in 2006. Both GPUs will seemingly take on the RTX 4090 at the top of the RDNA3 stack, and both will use Navi31, the big die of the family. The XTX version will be the full version of the chip, with the XT version slightly cut down. The flagship should come with 24GB of 20Gb/s memory on a 384-bit memory bus, which aligns with previous rumors. This should give it almost a terabyte per second of memory bandwidth, and that number doesn’t include the benefits of its Infinity Cache.
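
Those bandwidth figures fall straight out of the standard GDDR formula: per-pin data rate multiplied by bus width, divided by eight bits per byte. A quick sketch of the arithmetic (the specs themselves are still rumors):

```python
def gddr_bandwidth_gb_s(data_rate_gbps_per_pin, bus_width_bits):
    """Peak bandwidth in GB/s: per-pin data rate x bus width / 8 bits per byte."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

print(gddr_bandwidth_gb_s(20, 384))  # rumored RX 7900 XTX: 960.0 GB/s
print(gddr_bandwidth_gb_s(20, 320))  # rumored RX 7900 XT: 800.0 GB/s
```

That 960GB/s result is where “almost a terabyte per second” comes from, and the 320-bit card’s 800GB/s matches the figure quoted below.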

This fan-made render of the flagship Navi31 GPU looks tantalizing. (Image: @Technetium)

It’s expected to offer up to 96MB of cache this time, according to TechSpot. That’s a reduction from the 128MB offered on the 6900 XT. However, AMD could be using its V-Cache technology to vertically stack some of it, too. It was previously rumored the cards might offer as much as 384MB, which would be truly nuts. The XTX card is reported to boast 12,288 stream processors. That’s more than double the 5,120 found in the previous flagship, the Radeon RX 6900 XT. Power consumption is still a big unknown here. AMD has previously stated that it’ll be going up this generation; since the 6900 XT was a 300W card, we can comfortably predict 375W or so.

The cut-down XT version will use 20GB of GDDR6 memory across a narrower 320-bit bus. It’ll allegedly offer 800GB/s of memory bandwidth, which is more than the RTX 4080 16GB, not accounting for Infinity Cache benefits. It’s possible this card could land in between Nvidia’s top GPUs. It’ll offer 10,752 stream processors, more than double the 6800 XT’s 4,608.

Notably, neither GPU will use the newfangled 12VHPWR connector. Instead, they’ll use the tried-and-true eight- and six-pin connectors we all have now. This will come as good news for those who have been following GPU news: Nvidia is currently embroiled in a controversy over its 4-into-1 PCIe power adapters melting, supposedly caused by a suboptimal soldering job on the wires inside the plug that goes into the card. The hullabaloo prompted an AMD executive to confirm on Twitter that the company wouldn’t be using the new connector.

Sadly, we must report that although AMD is announcing these GPUs on Nov. 3, it might be another month before they reach retail. Noted tipster Greymon55 recently posted that they won’t go on sale until December. They were originally supposed to be offered two weeks after the reveal, in late November. Pricing and board power are still TBD, but we shall find out soon enough.


from ExtremeTech https://ift.tt/d4aU5xy


The Perseverance rover is a well-equipped robot with a gaggle of cameras, multiple spectrometers, and even a little box that makes oxygen. You can’t get every possible scientific instrument on a Mars-bound rover, though. To really understand the red planet, we need to get samples back to Earth, and Perseverance is preparing to take the next step in making that happen. NASA and the ESA have agreed on a location for Perseverance to deposit the first sample cache, which could be retrieved a few years down the line in the NASA-ESA Mars Sample Return Campaign.

During its time on Mars, Perseverance will analyze numerous samples with the tools at its disposal, but the team is also carefully curating a collection of samples that will come back to Earth. The rover was designed with an innovative sample caching system, which packages up rock cores in pristine metal tubes that will protect them from contamination on the return journey. So far, Perseverance has collected 14 rock-core samples in the tubes. The robot has several dozen sample tubes at its disposal.

NASA and the ESA have agreed that the first batch of samples will be deposited at a site known as Three Forks, near the base of the ancient river delta in Jezero Crater. The mission to retrieve the samples is still evolving, so the team has to make some guesses about how the Return Campaign will work.

A few months ago, the agencies updated the plan, dropping a second rover that was supposed to fly with the return vehicle. Now, Perseverance will be the primary means of getting samples to the Mars Ascent Vehicle (MAV). The return mission will also include a pair of helicopters based on Ingenuity’s wildly successful design. The tube cache at Three Forks will act as a backup in the event Perseverance cannot rendezvous with the MAV or an issue pops up in its sample caching system.

While Perseverance drops off its first collection of samples, engineers back on Earth have begun the process of testing hardware for the return campaign. In what is known as “Phase B,” the team is working to develop prototypes that will eventually become the final flight hardware, which will hopefully have no defects or software glitches. There’s enough that can go wrong without hardware failures. After the return lander touches down in Jezero Crater, the recovered tubes will be loaded aboard the MAV, a rocket that will blast them into orbit. At that point, an ESA spacecraft will have to pick them up and make its way back to Earth. If all goes as planned, the samples could be back on Earth as soon as 2033.


from ExtremeTech https://ift.tt/NvajXzM

Friday, October 28, 2022


Welcome back, space fans, first-time readers, and all folks in between. The first few days of the week were a bit quiet, space-ifically speaking, but there were some late-breaking items of interest, especially for fans of InSight and Mars exploration in general.

One of the Seismic Events InSight Detected Was Actually a Meteor Strike

Since landing on Mars, NASA’s InSight probe has detected the first seismic event ever recorded on another planet, improved our understanding of Mars’ crust, mantle, and core, measured the remnants of Mars’ magnetic field, and collected more data on Martian weather than any other probe sent to the planet thus far. The mission is set to wrap up at the end of December 2022, but an announcement NASA made on Thursday once again proved its value.

On December 24, 2021, InSight detected a magnitude 4 marsquake. After analyzing the seismic data and combining InSight’s measurements with data gathered by the Mars Reconnaissance Orbiter (MRO), NASA has determined that this particular marsquake was actually caused by a meteoroid impact. While the impactor was modest (16 to 39 feet across), this is still one of the largest crater-formation events NASA has ever witnessed in real time. NASA has published before-and-after photos of the impact site:

“This meteoroid impact crater on Mars was discovered using the black-and-white Context Camera aboard NASA’s Mars Reconnaissance Orbiter. The Context Camera took these before-and-after images of the impact, which occurred on Dec. 24, 2021, in a region of Mars called Amazonis Planitia.” Credits: NASA/JPL-Caltech/MSSS

“It’s unprecedented to find a fresh impact of this size,” said Ingrid Daubar of Brown University, who leads InSight’s Impact Science Working Group. “It’s an exciting moment in geologic history, and we got to witness it.”

The December quake was the first marsquake observed to have surface waves. As the name implies, surface waves are seismic waves that travel primarily along the surface of a planet. Although we detected seismic waves on the moon while the Apollo seismometer network was active, this is the first time we’ve detected seismic surface waves on any body other than Earth. And in this case, the surface waves traveled much faster than the waves InSight had previously observed in the rocks immediately under its own landing site.

Researchers are hoping to use this data and information collected from several other relatively energetic marsquakes to better understand the deep structures of the Martian crust. The northern and southern hemispheres of Mars are dramatically different. This has driven speculation as to the deep composition of the planet underneath each hemisphere. The information we gathered from InSight, once fully analyzed, could help us understand how different materials are distributed in the Martian crust.

Spare InSight Hardware Headed for Luna

InSight’s mission may be all but over, but there are plans to use some of its spare hardware in a similar mission to the moon in the near future. InSight’s seismometer is named SEIS (Seismic Experiment for Internal Structure). When the French space agency CNES and the Institut de Physique du Globe de Paris (IPGP) built SEIS, they also built a spare model of the hardware that remained on Earth. After InSight’s success, this spare hardware has been modified for a mission to the far side of Earth’s moon.

There’s definitely a Nazi base on the dark side of the moon. I saw it in a movie, so it must be true…

The SEIS Very Broad Band (VBB) seismometer will be one component of the Farside Seismic Suite and should be ready for lunar deployment by 2025 via NASA’s Commercial Lunar Payload Service program. It’s been decades since NASA had functional seismometers on the Moon; the Apollo network for monitoring seismic activity shut down in the late 1970s and we’ve never had seismometers on the far side of the moon at all. If Nazis really are secretly tunneling through the dark side of the moon in pursuit of our precious bodily fluids, projects like this one will make sure we know about it.

“The originality of the Farside Seismic Suite is that it will be independent of the lander,” Gabriel Pont, Farside Seismic Suite director at CNES, told Ars Technica. “That’s because it has to survive several lunar days and nights, which is not the case for the lander. The Farside Seismic Suite will have its own solar panels, antennas to talk with the orbiters, and its own thermal control devices.”

The Giant Impact Hypothesis is by a wide margin the leading explanation for how the moon formed, but there are still substantial questions about the event. A better understanding of the moon’s composition, structure, and geological activity could help us answer questions about where it came from in the first place.

ISS Forced to Maneuver (Again) to Avoid Space Debris

Earlier this week, the International Space Station was once again forced to maneuver to avoid being struck by space debris created by Russia’s anti-satellite weapons test in November 2021.

The Russian military’s performance in Ukraine over the past year offers a partial explanation for why Russia chose to destroy Cosmos 1408 with a missile, but the laws of physics are not particularly concerned with human concepts like nationalism. Orbital space junk is a danger to the long-term usefulness of space and to humanity’s common ability to explore the cosmos. Kessler Syndrome might not prevent us from using satellites in varying orbits, but it makes absolutely nothing better.

The debris distribution around Earth as of 2019. Data from a NASA simulation.

Since Russia’s test last year, the United States has committed to banning such testing within our borders and has called on other nations to make similar pledges. Thus far, Canada, Germany, Japan, New Zealand, South Korea, and the UK have all pledged not to perform anti-satellite weapons tests of their own.

NASA Announces Its UFO ‘Unidentified Aerial Phenomena’ Investigation Dream Team

NASA has announced the 16 individuals who will participate in its independent study to understand and characterize sightings of what it calls Unidentified Aerial Phenomena, or UAP. The agency defines UAP as “observations of events in the sky that cannot be identified as aircraft or as known natural phenomena.”

Official discussion of UAPs and UFOs has picked up sharply this year, as my colleague Adrianna Nine noted earlier this week, writing:

“The team is impressively diverse: Two astrophysicists, two policy specialists, two aviation specialists, an oceanographer, an AI startup founder, a science journalist, a planetary scientist, a former NASA astronaut, a telescope scientist, a space infrastructure consultant, an electrical and computer engineer, and a physicist each made the cut.”

NASA has posted a brief bio of each team member, and it gives the impression of a serious-minded bunch.

The US government may not devote a great deal of time to chasing reports of little green men, but there are UAP reports that remain unexplained to this day and some of them are backed up by active radar returns that confirm there was something beyond a merely visual phenomenon. There are obvious national security reasons why Uncle Sam would like to move these events into the “known knowns” category.

Skywatchers Corner

This week, we’re on location in the Adirondack Mountains of New York. Between the mountain tops and the dark night skies, the Adirondacks are one of the best places in the United States for skywatching. Orion might just be my favorite constellation, and it’s low in the sky as night falls. Brilliant Betelgeuse shines crimson from the mighty Hunter’s shoulder. Thursday evening, it was cold and quite clear, and we caught a few meteors — probably from the Orionids.

The Orionid meteor shower actually comes from the debris trail of Halley’s Comet. As the Earth sweeps out its spiraling trail through space, it passes through the comet’s path, and some of the debris gets caught in our atmosphere. But the meteor shower gets its name from the place in the sky from which it appears to originate. The Orionids appear to radiate from a point near the constellation Orion.

Cold, clear weather is great for skywatching. And while the Orionids technically peaked last week, they’ll keep going into early November. This shower often creates long, streaking fireballs. So, while you’re outside looking at the constellations and the ‘backbone of night,’ you might just get lucky and see a shooting star.

All through this summer and fall, Mars, Jupiter, and Saturn have kept up their own colorful show. It’s said that planets don’t twinkle like stars; because planets are so much closer, they appear as tiny disks rather than points of light, so the atmosphere doesn’t distort their light quite as badly. And this week, if you have clear skies, you might take a moment after sunset to look for the jewel-like colors of these three planets. (They’re especially striking at altitude because you can literally get above the clouds.) It’s a great demonstration of the planets’ steady light.

But more than a physics demonstration, it’s just plain cool. Think of it: you’re looking at a whole ‘nother planet. They’re so far away! But you can still see them. Here we are looking out, through space, into the outer Solar System, and you can see entire other worlds with your naked eye. It’s just a marvelous thing. So get outside and look up!


from ExtremeTech https://ift.tt/476q2xo


It finally happened — after months of snide public statements and legal wrangling, Elon Musk has acquired Twitter for the original $44 billion offer. Musk wasted no time firing several executives, including CEO Parag Agrawal, almost the instant the deal was done. Musk has hinted at some major changes to the way Twitter works, including the end of lifetime bans, but he promises it won’t become a “free-for-all hellscape.” It may not be possible to thread that needle, though.

According to Musk, who has a penchant for grandiose overstatement, he purchased Twitter because he wants to help humanity. Also among those immediately fired by Musk was Vijaya Gadde, Twitter’s head of legal, policy, and trust. Musk has harshly criticized Twitter’s handling of numerous disputes, which sometimes involved temporary or lifetime bans on controversial figures. Reports now indicate that Musk plans to end the practice of permanently banning users.

Conservative users, who largely believe Twitter discriminates against them, are cheering the news. Musk has spent the past few years adopting right-wing talking points around COVID and criticizing Democrats, so they do have reason to be hopeful. Should Musk follow through on his apparent intention to reverse lifetime bans, that could mean the return of former president Donald Trump, who was permanently banned from Twitter following the January 6th Capitol riot. Trump has since launched his own social network known as Truth Social, but it’s apparently floundering. Returning to Twitter is probably what he’s wanted all along, anyway.

Musk says he plans to make some changes to the company’s suspension policy quickly, so accounts banned for harassment and misinformation could reappear soon. In addition to Trump, conservative political actors on Twitter are begging Musk to reinstate the accounts for Infowars and its founder Alex Jones, as well as Project Veritas, Milo Yiannopoulos, and election conspiracy theorist Mike Lindell.

We might soon get a preview of how, if at all, Musk will attempt to rein in misinformation on his new social network. As the deal closed, news broke of a break-in at the home shared by US Speaker of the House Nancy Pelosi and her husband Paul Pelosi. While the Speaker was in Washington at the time, reports say Paul was badly injured and is currently recovering in the hospital. Conservatives on Twitter have started alleging this attack was a false flag aimed at influencing the upcoming election — they have compared Pelosi to actor Jussie Smollett, who was convicted of concocting a fake assault story several years ago. It’s trending right alongside news of the break-in. Should this become a new election conspiracy theory, the old Twitter would have taken action, but will Musk?

Right now, it sounds like Musk wants to have it both ways. He’s committed to running Twitter as a “digital town square” that allows diverse opinions, but he also says it must be “warm and welcoming to all.” Regardless of what Musk says during this pivotal time for the company, it seems inevitable that Twitter will spend more time targeting bots than misinformation. Some people may prefer that, but it could make the site, if not a hellscape, at least a lot more combative. Musk got himself into this mess, and he’ll be blamed for whatever happens.


from ExtremeTech https://ift.tt/hyKeLfS


(Photo: Isaac Burke/Unsplash)
Soon the natural light filtering through your window could do more than just brighten up your space. Scientists have achieved a level of efficiency for dye-sensitized solar cells (DSCs) that might enable the creation of energy-generating windows.

In a paper published this week in the journal Nature, researchers from Switzerland’s École Polytechnique Fédérale de Lausanne detail the way in which they helped DSCs harvest energy from the full visible light spectrum. DSCs, a type of low-cost, thin-film solar cell, use photosensitized dye attached to the surface of a wide-bandgap semiconductor to convert visible light into energy. Despite their financial and physical practicality, they’re not as efficient as conventional solar cells, which delegate both light absorption and charge generation to the semiconductor itself. This means that even though energy-generating windows have technically been possible for a while, the devices wouldn’t have been worth the resources.

This new efficiency record could change that. The team in Switzerland enhanced DSCs’ efficiency by meticulously controlling the assembly of dye molecules on the cells’ nanocrystalline mesoporous titanium dioxide (TiO2) films. Pre-adsorbing a single layer of a hydroxamic acid derivative on the film’s surface allowed the scientists to improve the molecular packing and performance of two custom-designed sensitizers, which were found to be capable of harvesting light from the entire visible spectrum.

Dye-sensitized solar cells. (Image: Ronald Vera Saavedra Colombia/Wikimedia Commons)

During a simulation of standard air mass 1.5 sunlight—the reference spectrum typically used to rate solar cells—the enhanced DSCs achieved a power conversion efficiency (PCE) of 15.2 percent. Considering that 12.3 percent was the best-known DSC PCE in 2019, that figure is impressive, especially since the enhanced cells maintained operational stability over 500 hours of testing. Better yet, when the scientists tested enhanced DSCs with a larger active surface area under ambient indoor lighting, they achieved a groundbreaking PCE range of 28.4 to 30.2 percent.
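
Power conversion efficiency is just electrical power out divided by light power in, so those percentages translate directly into watts. A minimal sketch (the one-square-meter window size is our illustrative assumption, not a figure from the paper):

```python
AM15_IRRADIANCE_W_M2 = 1000  # standard "one sun" test irradiance, W/m^2

def output_watts(pce, area_m2, irradiance_w_m2=AM15_IRRADIANCE_W_M2):
    """Electrical output = incident light power x power conversion efficiency."""
    return irradiance_w_m2 * area_m2 * pce

print(output_watts(0.152, 1.0))  # ~152 W from a 1 m^2 cell in full sun
print(output_watts(0.123, 1.0))  # ~123 W at the 2019 record efficiency
```

Indoor ambient light carries far less power per square meter than direct sunlight, which is why a roughly 30 percent conversion rate there still yields modest absolute wattage—well suited to the low-power electronics mentioned below.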

The team believes the enhanced DSCs could pave the way for energy-generating windows, skylights, and greenhouses in the near future. They could even find a place in low-power electronic devices, which would then use ambient light as an energy source.


from ExtremeTech https://ift.tt/KmR3VNs


Call of Duty is one of the most popular video game franchises in history — as Sony has pointed out while trying to keep Microsoft from acquiring Activision Blizzard. CoD could almost be considered a genre all by itself. Naturally, the release of a new game in the series is a big deal, but as gamers excitedly tear into their physical copies of Call of Duty: Modern Warfare II, they’re finding a nasty surprise. Instead of the game, the discs hold just 72MB of data; the rest must be downloaded.

The disc versions of the game appear to be functionally identical to digital copies. Instead of buying the license and downloading it, you purchase a disc that ships to you with a license key. Pop that in your console, and you don’t get to play the game right away — you have to download the game just like those who bought digital codes. So, the production and shipping of these discs is a complete waste of time and resources.

Call of Duty: Modern Warfare II is a roughly 35GB game, small enough to fit on a modern Blu-ray game disc. However, with the addition of a day-one patch, MWII balloons to a ridiculous 150GB on the PS5. Setting aside for a moment how silly it is to ship a disc with essentially nothing on it, it’s unacceptable that the publishers didn’t tell people this is what they were buying. There are good reasons people might want the disc version of a game.

Playing games via physical discs isn’t as common as it used to be, but it provides an important option for people who don’t have enough internet bandwidth to download dozens of gigabytes. Many ISPs also cap data usage, so you may not want to download enormous games. In the US, which generally has higher data caps than many countries, Comcast caps residential users at 1.2TB per month. Downloading Modern Warfare II would eat up more than 10 percent of that allotment, even if you went out of your way to buy a disc version of the game. Plus, a disc is more likely to work in the absence of online services, which publishers like to shut down to save money once a game is no longer popular. It appears none of those advantages matter to Activision.
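
That “more than 10 percent” checks out with a one-liner:

```python
download_gb = 150   # approximate Modern Warfare II install size on PS5
cap_gb = 1200       # Comcast's 1.2TB residential monthly data cap

print(f"{download_gb / cap_gb:.1%} of the monthly allotment")  # 12.5%
```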


from ExtremeTech https://ift.tt/AiGUkqd


Tesla has leaned into autonomous driving like few other automakers with its Autopilot system, which has been a core feature of its vehicles since 2013. However, that push could get the company in trouble. A new report says that the US Department of Justice has opened a criminal investigation of Tesla following a series of crashes and deaths related to Autopilot.

Initially, Tesla required customers to pay extra for Autopilot, priced between $5,000 and $8,000, depending on features. As it expanded to produce less expensive vehicles, Tesla included the basic Autopilot feature set for no additional charge. As more drivers started using Autopilot, we started to see reports of accidents where Autopilot was in complete control of the car. Tesla has since expanded the “Full Self-Driving (FSD)” features of its vehicles to make the cars more reliable and able to drive themselves in more situations. And yet, Autopilot is not true self-driving.

At issue is the way Tesla advertises and discusses Autopilot. While the company’s more careful disclaimers note that drivers have to keep their hands on the wheel, anyone who has driven in a Tesla knows that the vehicle will often let you zone out for long periods of time without any nudges. And then there’s the way Tesla CEO Elon Musk talks about Autopilot. A promotional video on Tesla’s website features Musk saying that the driver is only there “for legal reasons,” and “the car is driving itself.”

Overhead signs present a challenge for autopilot systems

Reuters reports that the DOJ investigation started last year and could be a more serious threat to Tesla than the various state-level investigations already pending. The case could result in criminal charges against individual executives or the company as a whole, sources have said. However, charges would most likely require the probe to uncover evidence of intentional misrepresentation. If not, Tesla can always point to its disclaimers as legal cover, even if Musk is out there making wild claims about Autopilot’s capabilities. The National Highway Traffic Safety Administration is also investigating crashes in which Teslas were in Autopilot mode.

Tesla does not have a media office — it only has Elon Musk, who has been too busy closing his Twitter acquisition to tweet any statements about this report. This comes as Tesla has been paring back the sensors in its cars, which has made some Autopilot features unavailable as the company works to update the system to rely solely on camera input. Tesla is not alone in struggling to perfect self-driving technology. After years and billions of dollars, big players like Google and Uber are still struggling to make vehicles that can drive as well as humans.


from ExtremeTech https://ift.tt/jTzQoe6


NASA’s InSight Mars mission is winding down, and while it never managed to get the burrowing heat probe to work, InSight is still a huge success thanks to its groundbreaking seismometer. Now, the first seismometer to operate on another planet is making history again. Spare parts from the Seismic Experiment for Internal Structure (SEIS) will form the basis for a seismic instrument that will make its way to the far side of the moon in 2025.

SEIS was designed and developed by the Institut de Physique du Globe de Paris (IPGP) and the French CNES space agency. Work began back in the 90s, and eventually the project was chosen to fly on InSight, which reached Mars in 2018. As part of the development process, engineers built a duplicate seismometer that is still on Earth. Parts of this device will be integrated into the Farside Seismic Suite (FSS) that NASA plans to deploy to Schrödinger crater on the far side of the Moon.

SEIS (above) featured three ultra-sensitive pendulums spaced 120 degrees apart, allowing it to detect movement as small as 10 picometers in any direction. That’s smaller than the width of a single atom. This incredible precision allowed NASA to record hundreds of marsquakes, far more than scientists expected to detect. For the FSS, one of the backup SEIS pendulums will become the Very Broad Band (VBB) seismometer for the mission, which will measure vertical ground vibrations. A second instrument known as the Short Period Seismometer (SPS) will monitor movement in other directions.
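
The geometry is the clever part: three identical pendulums whose sensing axes point in different oblique directions can jointly recover full three-dimensional ground motion, because each pendulum measures only the projection of that motion onto its own axis, and three independent projections pin down the vector. A toy sketch of that reconstruction (the 30-degree tilt and motion values are our illustrative assumptions, not the instrument’s actual design):

```python
import numpy as np

# Hypothetical geometry: three sensing axes spaced 120 degrees apart in
# azimuth, each tilted 30 degrees up from horizontal.
tilt = np.deg2rad(30)
azimuths = np.deg2rad([0, 120, 240])
axes = np.array([[np.cos(tilt) * np.cos(az),
                  np.cos(tilt) * np.sin(az),
                  np.sin(tilt)] for az in azimuths])

ground_motion = np.array([3e-12, -1e-12, 2e-12])   # meters; picometer scale
readings = axes @ ground_motion                    # each pendulum sees one projection
recovered = np.linalg.solve(axes, readings)        # invert to full 3-D motion
print(np.allclose(recovered, ground_motion))       # True
```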

The existing SEIS hardware was already a good match for the proposed lunar application, according to Gabriel Pont, who manages the FSS project at CNES. “The Farside Seismic Suite seismometer will be tuned for lunar gravity. It will be placed in a vacuum protection case called seismobox,” Pont told Ars Technica. The team expects the 40-kilogram FSS package to have sensitivity similar to SEIS on Mars, making it about 10 times better than the last seismometers deployed on the moon during the Apollo program.

NASA has awarded the contract for transporting the Farside Seismic Suite to Draper Laboratory. The lander (see above) is just the vehicle for getting the FSS to the surface; the instruments will be independent of it, with their own solar panels, communications, and heaters. To save power, the FSS will not transmit data during the lunar night, but it will connect to an orbiter while in sunlight to upload data. NASA is paying Draper $73 million for the landing under the Commercial Lunar Payload Services program, and it’s currently set for May 2025.


from ExtremeTech https://ift.tt/BK61Mzm


(Photo: /u/NoDuelsPolicy on Reddit)
This week we’ve seen more reports of fried RTX 4090 adapters, with the count now up to a half dozen or so. Previously we just knew that some adapters were melting, and in doing so also damaging the connector on the card. It wasn’t clear exactly what was to blame for this situation. Was it the adapter, the bending of it, or some other engineering mishap? Now the intrepid tinkerers at Igor’s Lab have revealed what they believe is the culprit: the Nvidia-designed 4-into-1 adapter. This takes four 8-pin cables and routes them into a single 16-pin plug. Its poor construction is likely the cause of the issue, and photos reveal it to be a hot mess, pardon the pun.

Previously all we knew was that adapters were melting. It was theorized the failures were somehow related to bending the adapter, which is required due to the size of the GPUs. Their width places the connector close to most cases’ side panel, necessitating an almost 90-degree bend to move the cable out of the way. This isn’t rocket science; you don’t want to put a serious bend on any electrical connection. That might not be an issue if the connection were as solid as a rock. However, it turns out it’s not very solid at all, seemingly due to insufficient soldering around the wires that is meant to hold them in place. Igor’s Lab took one of the adapters apart to investigate, and the results aren’t pretty.

The exposed connector looks like a lab experiment, not something provided with a $1,600 GPU. (Image: Igor’s Lab)

According to his investigation, there are six contact patches and four wires on each side of the adapter. Two on the edges are soldered to one point each, and the four in the middle are connected to two wires each. You can see in the photo there’s a surprisingly small amount of solder used for these connections. Igor says the connections use a 0.2mm copper base with 2mm of solder per wire. That leaves 4mm of solder for the twin cable connections.

For 14-gauge wire, that’s not much. Now imagine bending those wires at a right angle while they’re hot, and you can see the problem. This could cause the solo cables at the edges to come loose first. Unsurprisingly, that’s exactly what we’ve seen in several of the reports so far (see below). When that happens, all the current flows through the remaining wires, and they heat up drastically.
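
To see why a loose joint or two matters so much, here’s a rough back-of-envelope model (the 450W draw and 5-milliohm contact resistance are illustrative assumptions, not Igor’s Lab’s measurements). The card’s power draw fixes the total current, so current that used to spread across six joints crowds into four, and resistive heating per joint grows with the square of the current:

```python
POWER_W, VOLTS = 450.0, 12.0   # assumed card draw over the 12V pins
R_JOINT = 0.005                # ohms per solder joint, illustrative guess

def per_joint(n_intact_joints):
    amps_total = POWER_W / VOLTS           # ~37.5 A in total
    amps = amps_total / n_intact_joints    # shared by the intact joints
    return amps, amps**2 * R_JOINT         # heat per joint: P = I^2 * R

for n in (6, 4):
    amps, watts = per_joint(n)
    print(f"{n} joints: {amps:.1f} A each, {watts:.2f} W of heat per joint")
```

Losing two joints raises per-joint current 1.5 times and per-joint heat 2.25 times, which is how a couple of loose edge wires can cascade into melted plastic.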

A melted adapter via Reddit.

To summarize the findings, the problem is not with the actual cables coming from the PSU. Nor is it with the connector on the PCB. It seems to lie exclusively with the adapter designed by Nvidia. This is included with every Ada GPU and was made by a third party for Nvidia. Igor assumes Nvidia wasn’t aware of how poorly they were made, or it didn’t examine them too closely. If it had, it would never have allowed its add-in board (AIB) partners to include them with its flagship GPUs.

For now, Nvidia is still investigating. In an update, Igor’s Lab says Nvidia has reached out to its board partners and asked to have all damaged GPUs sent to HQ for analysis. The next logical step would be for Nvidia to announce a recall for the adapters: have new ones made and replace the old ones for free. That could take some time, obviously, and will seriously piss off current RTX 4090 owners, but those gamers will likely prefer a safer adapter in the long run. They could alternatively purchase an ATX 3.0/PCIe Gen 5 power supply, but those are still hard to find, and they’ll be expensive. Companies like Seasonic and CableMod are offering 90-degree adapters too, but they’re not for sale yet.

Seasonic’s upcoming 90-degree 12VHPWR cable. (Image: Bilibili)

Until we hear from Nvidia officially, our advice is pretty simple: be careful with your adapter, and don’t bend it. If it’s currently bending due to your case’s side panel, take off the side panel.


from ExtremeTech https://ift.tt/umSe2BC

Thursday, October 27, 2022


In the past few decades, astronomy has gone from speculating about the existence of planets outside our solar system to identifying more than 5,000 of them. Now, the hunt for habitable exoplanets is on, but we may have a smaller pool of possible candidates, according to a new study from researchers at the University of California, Riverside. After analyzing a nearby exoplanet, the team has concluded it is unlikely that the most common type of star in the Milky Way is capable of supporting life as we know it.

The study focused on an exoplanet called GJ 1252b, which orbits an m-dwarf star (sometimes called a red dwarf) just 66 light years away. That’s right next door in astronomical terms. These small, long-lived stars are so numerous that discovering habitable planets around them could mean a much higher chance of finding extraterrestrial life.

However, there’s one big problem: An exoplanet close enough to a red dwarf to have liquid water would also be subjected to intense radiation and unpredictable solar flare activity. Could such a world even maintain an atmosphere? The proximity of GJ 1252b provided the Riverside researchers with a chance to find out one way or the other.

Astronomers measured infrared radiation from GJ 1252b during a so-called “secondary eclipse,” when the planet passes behind its star, which blocks both the planet’s own emission and the starlight it reflects. The radiation readings suggested the exoplanet, which completes an orbit of its host star in just 12 Earth hours, has a surface temperature of 2,242 degrees Fahrenheit (1,226 degrees Celsius). That’s hot enough to melt gold, silver, and copper. This led the researchers to conclude that GJ 1252b does not have an atmosphere.

The Trappist-1 system consists of several rocky planets orbiting an m-dwarf star. The Webb telescope will soon conduct new observations of these worlds.

Further, the team calculated what it would take for the exoplanet to hold onto any atmosphere whatsoever in the face of such intense stellar activity. Even with carbon levels 700 times higher than Earth’s, they estimate GJ 1252b would still have its atmosphere stripped away. “It’s possible this planet’s condition could be a bad sign for planets even further away from this type of star,” says Riverside astrophysicist Michelle Hill.

This doesn’t mean habitable planets are out of the question, though. While most of the 5,000 stars closest to Earth are m-dwarfs, about 1,000 of them are similar to the sun and could potentially host Earth-like planets. And the universe is a big place — there are always more stars to survey, and new instruments like the James Webb Space Telescope will help us observe them in greater detail than ever before.


from ExtremeTech https://ift.tt/2Roqfrk


Windows 11 brought a raft of new features like a revamped start menu, more control over snap layouts, and integration with more Microsoft apps and services. One of the biggest additions was the Windows Subsystem for Android, which allows your PC to run Android apps. The subsystem rolled out as a preview, but it’s already left that caveat behind in the big Windows 11 2022 update. Microsoft isn’t resting on its laurels, either. It already plans to improve the subsystem with a move to the latest Android 13 OS.

The Windows Subsystem for Android (WSA) is a virtual machine built into Windows, and it’s currently based on Android 12L (an upgrade over the original Android 12 distribution). Android is open source, so we have more insight into what Microsoft is doing with this feature than we do with other aspects of Windows 11. Over on the WSA GitHub page, Microsoft has posted a roadmap detailing the features it has added so far, along with the ones it plans to implement in the future. And right at the top of the list is Android 13.

Android 13 launched just a few weeks ago on Pixel phones, and Samsung only started rolling out the update to some versions of the Galaxy S22 in late October. It’s not the biggest update — most of the user-facing changes come in the form of enhanced theme support, which won’t matter on Windows, but the system optimizations and API updates are essential to future functionality. Microsoft’s apparent drive to keep the WSA updated is refreshing. Many Android-on-Windows projects have fallen behind, making it frustrating to run Android apps on a PC.

In addition to updating the WSA to Android 13, Microsoft plans to implement file transfer support, making it mercifully easier to move files between the Windows and Android parts of the system. Microsoft will also add default access to the local network, another way to make the Windows Subsystem for Android less compartmentalized.

Another planned addition is shortcuts, an Android feature that lets apps link to specific functionality. For example, a messaging app can provide a shortcut to a frequent contact that saves you from navigating to the conversation manually every time. There’s one more convenience feature on the way: picture-in-picture mode. That means Android media apps on Windows will be able to draw over the top of the Windows UI.

The roadmap doesn’t include any proposed dates or even a general timeline for these improvements. Microsoft was only a few months behind implementing Android 12L, and it recently opted to change the way it releases updates for Windows. No more will we have to wait for major semi-annual releases to get new features. So, we can expect the WSA enhancements to arrive whenever they’re ready.


from ExtremeTech https://ift.tt/fbT9mAQ


(Photo: Mishaal Zahed/Unsplash)
At long last, Apple is bringing USB-C to the iPhone—it’s just not exactly thrilled about it. During a media interview Tuesday, an Apple marketing executive confirmed the company would be complying with the European Union’s new law requiring all mobile devices to charge via USB-C.

The confirmation came during The Wall Street Journal’s Tech Live conference this week. Reporter Joanna Stern sat onstage with senior VP of global marketing Greg Joswiak and software VP Craig Federighi to discuss Apple’s overarching product innovation philosophy. Stern asked the duo how Apple planned to approach the EU’s USB-C requirement, which was solidified this month after a year of legislative work.

“Obviously we’ll have to comply, we have no choice,” Joswiak said. “We think the approach would’ve been better environmentally, and better for our customers, to not have the government be that prescriptive.”

From an environmental standpoint, Joswiak worries the EU’s new requirement (which will take effect in 2024) will result in the same amount of e-waste it aims to prevent. iPhone users who already own Lightning cables will have to dispose of those cables as soon as they upgrade their phones. Considering Joswiak’s point that over a billion people own iPhones—and that many iPhone users have more than one charger—it’s possible that 2024 will see a drastic uptick in cable disposal.

Software VP Craig Federighi (left) and senior VP of global marketing Greg Joswiak (right) onstage at WSJ’s Tech Live conference.

But will all of us be forced to ditch our Lightning cables, or just those upgrading within the EU? Joswiak refused to say whether Apple would be switching to USB-C on all of its future mobile devices or only those it would sell across the pond. There are some customers outside the EU who’d prefer that all iPhones charge via USB-C; after all, iPads now use USB-C, so why not go all in? On the other hand, some likely share Joswiak’s sentiment that a port change would be inconvenient and wasteful.

“I don’t mind governments telling us what they want to accomplish,” Joswiak said during the interview. He pointed to times in which mobile phone manufacturers have been forced to adopt specific hearing aid compatibilities that ended up failing more often than not. Joswiak expressed that he’d rather government entities allow tech companies to find their own ways of meeting collective goals rather than prescribing restrictive methods.

Joswiak’s concerns might stem from design autonomy, but they also might be the result of Apple’s desire to enforce brand loyalty. Lightning cables are only compatible with Apple products—they can’t be used with any other manufacturer’s devices. Once a person has invested in Apple infrastructure, they might find it too inconvenient to switch to, say, a Samsung or Google smartphone. (The same concept can be seen in Apple’s commitment to keeping Android users out of iMessage.) By switching to USB-C, iPhone users within the EU are one step closer to potentially abandoning their brand loyalty and dumping Apple for good…and from a monetary standpoint, why would Apple want that?


from ExtremeTech https://ift.tt/31quAjZ


The International Space Station

NASA has announced that the International Space Station (ISS) executed an orbital course correction on Monday. The ISS doesn’t often need to dodge debris, but the maneuver was necessary in this case to avoid a piece of Russian space junk. The agency says the dangerous fragment came from the Cosmos 1408 satellite, which Russia destroyed in an anti-satellite weapons test in late 2021.

The crew fired the engines on Progress 81 (fittingly, a Russian cargo vessel docked at the station) for about five minutes (305 seconds). Following the maneuver, the station’s apogee (the highest point in its orbit) was raised by 0.2 miles, and its perigee (the lowest point) by 0.8 miles. That was enough to steer well clear of the debris, which was projected to pass within three miles of the station. Even a small piece of space junk could seriously damage the station and risk the lives of astronauts because of its enormous speed relative to the ISS.
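
For a sense of scale, those numbers imply only a tiny nudge. A back-of-envelope estimate (our assumptions: a near-circular orbit at a mean altitude of about 417 km and a purely tangential burn) using the vis-viva relation works out to roughly half a meter per second of delta-v:

```python
MU = 398_600.0      # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6_378.0   # km
MILE_KM = 1.609     # km per mile

a = R_EARTH + 417.0                # km; assumed mean ISS altitude
v = (MU / a) ** 0.5                # circular-orbit speed, ~7.66 km/s
# Apogee rose 0.2 mi and perigee 0.8 mi, so the semi-major axis
# (their average) grew by 0.5 mi.
da = 0.5 * (0.2 + 0.8) * MILE_KM   # km

# For a small tangential burn, vis-viva gives dv = mu * da / (2 * a^2 * v).
dv_m_s = 1000 * MU * da / (2 * a**2 * v)
print(f"delta-v ~{dv_m_s:.2f} m/s "
      f"(~{1000 * dv_m_s / 305:.1f} mm/s^2 average over the 305 s burn)")
```

That half-meter-per-second figure is only our estimate, but it illustrates how gentle a debris-avoidance burn can be while still moving the station miles clear of the threat.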

NASA says Monday evening’s Pre-Determined Debris Avoidance Maneuver (PDAM) did not impact space station operations. However, this may become a more common occurrence as the orbital environment around Earth grows increasingly crowded. The testing of anti-satellite weapons certainly isn’t improving matters: Russia’s decision in November 2021 to test its anti-satellite weaponry on one of its own satellites produced over 1,500 pieces of trackable debris, all of them potentially dangerous to space operations.

NASA played it straight when announcing the course correction, leaving its probable frustration with the Russians unsaid. The agency got about as spicy as it could when commenting on the destruction of Cosmos 1408. After the test, Russia was roundly criticized by the world’s space agencies, including NASA, which called the test “dangerous and irresponsible.” More recently, Russia has threatened to use similar weapons against SpaceX Starlink Satellites, which are providing connectivity to Ukrainian military forces in their war with Russia.

As illustrated by this maneuver, an escalation of space warfare that involves picking off satellites could put the ISS and other missions at grave risk. That’s one of the reasons the US proposed an end to orbital weapons tests several weeks ago. It’s unlikely Russia would agree to such a ban — it’s already leaning away from working with other space agencies. Earlier this year, Russia’s Roscosmos announced that it would pull out of the International Space Station after 2024 so it could focus on building its own station.


from ExtremeTech https://ift.tt/CvMBwxF

Wednesday, October 26, 2022


(Photo: Library of Congress/Unsplash)
Though we haven’t yet figured out how to actually predict earthquakes, a California-based system aims to give people in earthquake-prone regions a few extra seconds to take cover. ShakeAlert is an earthquake early warning (EEW) system that pushes urgent mobile phone alerts at the start of a quake—and yesterday’s magnitude 5.1 event near San Jose proved it works as intended.

ShakeAlert first came on the scene in 2019. The US Geological Survey (USGS), California Geological Survey, California Governor’s Office of Emergency Services, and several California universities had been working together for over a decade to create an EEW system that could alert people on the west coast to impending earthquakes. When ShakeAlert first launched, it only covered California; two years later, it expanded to include Oregon and then Washington.

The system works by using geographically distributed seismic sensors to detect two types of waves: fast-moving compressional (P) waves and the slower-moving shear (S) waves that cause most of the damaging shaking. The sensors send these signals to ShakeAlert’s data processing center. Once the processing center receives signals from four separate sensors, it prompts the USGS to initiate an alert. If the sensors register stronger signals as the earthquake continues, the USGS updates the quake’s magnitude accordingly. Though ShakeAlert is best known for its mobile notifications, its alerts can also be distributed via radio, television, public sirens, and the Federal Emergency Management Agency (FEMA) wireless emergency alert system (most recognizable for its dissemination of Amber Alerts).
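
That P-versus-S speed gap is the whole trick: an alert travels at network speed while the damaging shear waves crawl along at a few kilometers per second, so anyone sufficiently far from the epicenter gets a head start. A simplified sketch (the wave speeds, sensor distance, and processing delay below are typical assumed values, not ShakeAlert’s actual parameters, and real-world alert delivery adds further latency):

```python
V_P, V_S = 6.0, 3.5   # km/s; assumed typical crustal P- and S-wave speeds

def warning_seconds(epicenter_km, sensor_km=10.0, processing_s=3.0):
    """Seconds between the alert going out and S-wave shaking arriving."""
    alert_out = sensor_km / V_P + processing_s  # detect the P wave, process it
    shaking_arrives = epicenter_km / V_S        # damaging S wave reaches you
    return shaking_arrives - alert_out

# Someone roughly 84 km (52 miles) from the epicenter:
print(f"~{warning_seconds(84):.0f} s of potential warning")  # ~19 s
```

The corollary is that people closest to the epicenter, where shaking is strongest, get the least warning, because the S waves arrive before any alert can beat them there.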

Around the same time that it started serving Oregon and Washington, ShakeAlert sought to enhance its offering by looking at the way people respond to earthquakes on their smartphones. The service partnered with Google. During the partnership’s initial stage, ShakeAlert and Google improved their delivery of earthquake notifications on Android phones. Then they moved on to sourcing data from mid- or post-quake Google searches. The idea is that when several people search for things like “earthquake Los Angeles,” Google can use those search locations to help determine the earthquake’s spread.

It isn’t clear exactly how much progress ShakeAlert and Google have made on that front, but when a magnitude 5.1 earthquake struck Santa Clara County, California on Tuesday, many smartphone users received notifications before they began to feel the ground shake. “Got earthquake early warning in Daly City just before I felt the shaking. Earthquake early warning says it was magnitude 5.1 in Santa Clara County,” tweeted LA Times reporter Ron Lin shortly after the event ended. “Looks like I was 52 miles northwest of the epicenter. I thought the MyShake ShakeAlert warning was a false alarm lol, and then I felt the shaking!” Another Twitter user said they received a ShakeAlert notification seven seconds before they started feeling the quake itself.

Though a seven-second head start might not sound significant, it can make all the difference to those within an earthquake’s radius. The gap allows people to duck under desks, steer clear of large trees, and otherwise seek cover, hopefully preventing serious injuries. The notifications are also an important facet of infrastructure protection: a well-timed alert can help trains slow down to avoid derailment, close water and gas valves to prevent utility disasters, and even prompt firefighters to open firehouse doors before they can jam shut.

ShakeAlert is still under development, meaning not all smartphone users will receive earthquake notifications without taking additional action. Android users appear to be receiving warnings automatically, but ShakeAlert recommends that iPhone users download the MyShake app if they live on the west coast and would like to receive urgent notifications.


from ExtremeTech https://ift.tt/jPLtnUv


Those who spent the 90s and early 2000s tinkering with computers will remember an encounter or two with Memtest86. They won’t be good memories necessarily, because you only needed Memtest86 when something was busted. This invaluable tool has been unmaintained for almost a decade, but it has risen from the dead, not unlike many of the nonfunctional PCs it helped diagnose. The new Memtest86+ version 6.0 is finally available, and it’s still completely free and open source.

Tracking down the cause of system errors in a PC can be grueling and confusing, particularly when the issue is a bad stick of RAM. These intermittent errors are often even harder to track down, and that’s why Memtest86 was created in the 90s. It provides a bootable environment that takes all the other variables out of the equation, allowing you to test your RAM in isolation to either rule it out or confirm it as the cause.

Memtest86 appeared to die in 2013 when PassMark acquired the original application, though the Memtest86+ fork remained generally available. Version 6.0 is that fork’s first update in nine years, and the program now supports all the latest PC hardware, including DDR4 and DDR5 RAM. It’s hard to believe, but DDR3 was the most recent standard back in 2013 when development petered out. The new version also adds XMP 3.0 support.


CPU support has been upgraded, as well. Memtest86+ now works with the latest Intel and AMD chipsets. It understands all of AMD’s Ryzen parts from the 1000 series all the way up to the newest Ryzen 7000. There’s also better support for pre-Zen AMD chips. On the Intel side, support goes all the way up to the new 13th Gen Core processors. The program supports up to 256 cores per machine, far more than any mainstream CPU currently has. There’s even, weirdly, support for some ancient Nvidia nForce chipsets in the new version.

You can get the new Memtest86+ release from the official website in both Windows and Linux flavors. The operating system requirement is just for the creation of boot media. The Windows version will make the bootable USB for you, but Linux users will be able to dump the raw ISO image on any suitable drive with the tool of their choice. If you want to check out the source code, it’s freely available on GitHub.


from ExtremeTech https://ift.tt/8J7LmcY


(Photo: JESHOOTS/Unsplash)
Given the recent prevalence of metaverse chatter, you’d think the new online space has a long, healthy road ahead. According to one analyst, this might not be the case. Matthew Ball, chief analyst at tech market research firm Canalys, recently shared that he believes most metaverse projects will shutter over the next couple of years.

Ball shared his thought process at Canalys’ 2022 Channels Forum in Barcelona, according to a new report from The Register. Addressing whether the metaverse could be considered “the next digital frontier” or “an overhyped money pit,” Ball said Meta’s convoluted project—and all its somewhat pitiful snags—can be considered a barometer for the metaverse’s success as a whole. He went on to say he believes most metaverse projects will have closed by 2025.

Ball’s analysis is in direct contrast with a prediction recently cited by Interpol, in which market research firm Gartner said a quarter of Americans will spend at least one hour in the metaverse per day by 2026. Meta itself has an even loftier hunch: Zuckerberg said earlier this year that he expects one billion people to be in the metaverse by 2030.

But from outside of the metaverse development space, it’s far easier to understand Ball’s gloomy prediction. Despite investing billions in the platform, Meta (by and large the biggest player in the metaverse space) has had a tough time getting its version of the metaverse onto its feet. Most have heard by now that Meta has to force its own employees to use its flagship metaverse product, Horizon Worlds. Meta employees’ hesitance to use Horizon Worlds at work makes sense, given recent confirmation that working in the metaverse totally sucks.

(Image: Meta)

Some big-name tech founders have rather explicitly (in more ways than one) dismissed the metaverse, saying it’s a disappointing product that eats up resources that could be used to fix real, existing problems—not just gratuitously create new ones. That aligns with Ball’s point regarding accessibility: In the midst of a cost-of-living crisis, few people are interested in or able to invest hundreds (if not thousands) of dollars in virtual spaces like the metaverse. “People are struggling in the real world, let alone in the virtual world, to be able to invest in property and items and other NFTs,” Ball said.

An undertaking as large and convoluted as the metaverse takes a lot of faith to pursue. Meta, along with tech giants like Microsoft, Apple, and Google, appears capable of sustaining that faith for now, even if only to prove its recent investments have been worth it. If Ball is on the right track, it’ll only be a few more years before we find out whether that’s actually the case.


from ExtremeTech https://ift.tt/9QJjfGq


GPU “Cable Gate” is officially a thing now. This week a Redditor posted images of a melted 12VHPWR adapter cable on their Nvidia RTX 4090; the power plug on the card’s PCB was irreversibly damaged as well. Naturally, the Internet was outraged, as this failure was predicted before the GPUs even launched: back in September, PCI-SIG issued a warning that using these power cable adapters could cause problems. Now that has apparently happened. In the wake of this event, it’s been confirmed that AMD will not be using the new cable on its upcoming RDNA3 GPUs, which has led to a lot of questions about Nvidia’s connector design. Nvidia says it’s investigating the incident.

Backing up a tiny bit, the 12-pin power connector is not a new thing. Back when Nvidia launched its Ampere architecture, it also introduced a new 12-pin power connector. This change allowed for more room on the PCB, as it went from dual 8-pin connectors on high-end cards to a single 12-pin, mounted vertically. Nvidia used that extra space to improve cooling performance. The GPUs included 8-to-12-pin adapters, and all was well. However, things have changed with the introduction of the RTX 4090.

The new ATX 3.0 and PCIe 5.0 specs provide a single 16-pin cable (above) that runs from the power supply to the GPU. This connector, dubbed 12VHPWR, differs from the Ampere-era cable in that it adds four signaling pins that communicate with the GPU. However, PSUs with the new cable aren’t readily available yet, so you have to use an adapter. For the RTX 4090, it’s typically a four-into-one adapter fed by 8-pin cables.

Previous reports suggested these adapters could cause issues with new, high-power GPUs when using ATX 2.0 power supplies and adapters. PCI-SIG sent a letter to its members warning them about this scenario. According to Wccftech, it stated, “Please be advised that PCI-SIG has become aware that some implementations of the 12VHPWR connectors and assemblies have demonstrated thermal variance, which could result in safety issues under certain conditions.” Thermal variance seems to be the key phrase here.

Redditor /u/reggie_gakill posted this photo of his melted cable and connector in /r/Nvidia.

Now a Redditor has posted photos of a melted cable and connector on an RTX 4090, and in the same thread, someone posted a second image of a melted connector. Additionally, when YouTuber Jayztwocents made a video about this controversy a while back, Nvidia told him there was nothing to worry about, saying it had done its testing and found no issues.

As far as Reddit goes, a rep for Nvidia has apparently reached out to the Reddit user. Now its PR chief says the company is investigating. In a statement to The Verge, Bryan Del Rizzo said, “We are investigating the reports,” adding they are collecting information from Reddit users.

Following this imbroglio, it’s being reported that AMD will not be using the cable at all for its upcoming RDNA3 GPUs. Instead, it’ll use a traditional dual 8-pin configuration, at least on the high end. That’s what sources told Kyle Bennett, formerly of [H]ardOCP notoriety, and an AMD SVP then confirmed it. This applies to AMD reference cards only; it’s unclear whether partner boards from MSI and others will also jettison the controversial connector. It’s also unknown whether a cable capable of delivering 600W will even be necessary for AMD’s next generation.

After all, if AMD’s high-end card only requires two 8-pin connectors, that’s a 300W budget. Add in 75W from the PCIe slot and you’re looking at a sub-400W GPU, compared with Nvidia’s 450W card. It’s possible that AMD’s solution is much more efficient than Nvidia’s as well. That’s simply because it will be the first consumer GPU with a chiplet architecture. This stands in contrast to Nvidia’s enormous monolithic die for Ada Lovelace.

Still, it’s a stumble for Nvidia’s newest GPUs, and it remains to be seen what the exact culprit of the melting behavior is. When the GPUs launched, some reviewers called out the scary amount of bending required to tuck the cables away; when you bend the adapter down to hide the cables, it puts a lot of pressure on the connector attached to the PCB, which some predict could pry its connections loose over time. The issue is exacerbated by how large the GPUs are, as their size can place the connector close to the side panel. As an alternative, CableMod sells a 90-degree adapter to prevent flex at the connector. The other option is to buy an ATX 3.0 PSU, but those are still not as prevalent as the ubiquitous ATX 2.0 models.

(Image: CableMod)

For now, we will have to wait and see what Nvidia has to say when its investigation is completed. People spending $1,600 on a GPU don’t necessarily want to drop another $200+ on a new ATX 3.0 power supply, so the adapters are necessary. But if those adapters are causing issues, they might not have a choice.

Nvidia and its partners have surely shipped thousands of GPUs to customers, if not more, and so far there are only two published reports of melting occurring. Still, it’s concerning it has even happened once.


from ExtremeTech https://ift.tt/aiwPA4s